

W2 Only :: Data Engineer (ETL, AWS, DMS, Data Pipelines, Cloud)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (ETL, AWS, DMS, Data Pipelines, Cloud) in Princeton, NJ, on a long-term contract. It requires 3-7 years of data engineering experience, proficiency in SQL and Python, and expertise in AWS services.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 3, 2025
Project duration: Unknown
Location type: On-site
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Princeton, NJ
Skills detailed: #Spark (Apache Spark) #Airflow #AWS DMS (AWS Database Migration Service) #AWS (Amazon Web Services) #Kafka (Apache Kafka) #S3 (Amazon Simple Storage Service) #DMS (Data Migration Service) #Cloud #Redshift #Data Modeling #Data Pipeline #Data Lake #PostgreSQL #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #Databases #MySQL #SQL (Structured Query Language) #Python #Lambda (AWS Lambda) #Terraform #DynamoDB #Scala #Computer Science #Data Engineering #NoSQL #Data Manipulation #Apache Spark
Role description
Job Title: Data Engineer (ETL, AWS, DMS, Data Pipelines, Cloud)
Location: Princeton, NJ (On-site)
Long-Term Contract
Job Description:
Required Skills & Qualifications:
• Bachelor's/Master's degree in Computer Science, Data Engineering, or a related field.
• 3-7 years of experience in ETL development and data engineering.
• Strong proficiency in SQL, Python, or Scala for data manipulation.
• Expertise in AWS cloud services (DMS, Glue, Redshift, S3, Lambda, Step Functions, EMR).
• Experience working with relational and NoSQL databases (PostgreSQL, MySQL, DynamoDB, etc.).
• Hands-on experience with Apache Spark, Kafka, or Airflow is a plus.
• Knowledge of data warehousing concepts and data lake architectures.
• Strong understanding of data modeling, performance tuning, and query optimization.
• Experience with CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation) is a plus.