Lorven Technologies Inc.

AWS Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer in Newark, New Jersey, on a long-term contract with an unspecified pay rate. It requires a Bachelor's degree, 10-12+ years of IT experience, proficiency in AWS services, strong SQL skills, and expertise in ETL processes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Newark, NJ
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #Computer Science #Data Pipeline #ETL (Extract, Transform, Load) #Redshift #Data Analysis #Data Warehouse #Scala #Programming #Business Analysis #SQL (Structured Query Language) #Data Storage #SAP #Compliance #Data Science #DMS (Data Migration Service) #Data Architecture #AWS (Amazon Web Services) #Data Processing #Data Integrity #Data Modeling #Java #Data Extraction #Data Engineering #Security #Storage #Python #Cloud #Monitoring #Data Security #S3 (Amazon Simple Storage Service) #Data Quality
Role description
Hi,

Our client is looking for an AWS Data Engineer for a long-term project in Newark, New Jersey. The detailed requirements are below.

Job Title: AWS Data Engineer
Location: Newark, New Jersey
Duration: Long term

Job description:
• Bachelor's degree in Computer Science or equivalent, with a minimum of 10-12+ years of IT experience.
• Design, build, and maintain data processing systems and pipelines on the AWS cloud platform, using AWS services such as S3, Redshift, DMS, and Glue to construct scalable, reliable, and efficient data solutions. Responsibilities include data modeling, ETL processes, and ensuring data quality and security.
• Data Architecture and Infrastructure: Design, implement, and maintain data pipelines, data warehouses, and other data solutions using AWS services.
• ETL Processes: Develop and implement ETL (Extract, Transform, Load) processes for data extraction, transformation, and loading into data storage systems.
• Data Modeling: Create and manage data models to ensure data integrity and facilitate efficient data analysis.
• Data Security and Compliance: Implement and maintain data security and compliance measures, including access controls, encryption, and data masking.
• Data Quality: Ensure data quality, accuracy, and consistency through data validation, cleansing, and monitoring.
• Collaboration: Collaborate with data scientists, business analysts, SAP functional SMEs, and other stakeholders to understand requirements and deliver data solutions that meet business needs.
• Troubleshooting: Diagnose and resolve data-related issues and performance bottlenecks.
• Keep up with the latest AWS services and data engineering best practices.
• AWS Services: Proficiency in AWS services such as S3, Redshift, DMS, Glue, and Lambda.
• Data Warehousing: Experience with data warehousing concepts and technologies.
• SQL: Strong SQL skills for data querying and manipulation.
• Programming Languages: Proficiency in programming languages such as Python or Java.
• Data Modeling: Understanding of data modeling principles and techniques.
• ETL Processes: Experience with ETL (Extract, Transform, Load) processes and tools.
• Data Security: Knowledge of data security best practices and compliance requirements.
• Problem-Solving: Strong problem-solving and analytical skills.
• Communication: Excellent communication and collaboration skills, team spirit, and a willingness to learn new technologies.
• Contributes to the technical design for development projects by reviewing and understanding the high-level functional specifications.