

Cyberobotix
AWS Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer in Berkeley Heights, NJ, on a W2 contract requiring 10+ years of experience. Key skills include AWS services, ETL development, SQL, Python/Scala, and big data technologies. AWS certifications are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Berkeley Heights, NJ
-
🧠 - Skills detailed
#Databricks #GIT #Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #Data Quality #Data Lake #Scala #AWS Glue #Data Science #Big Data #Agile #Data Modeling #Python #AWS (Amazon Web Services) #Amazon RDS (Amazon Relational Database Service) #Compliance #Data Pipeline #AWS Lambda #Cloud #Athena #Hadoop #IAM (Identity and Access Management) #Snowflake #ETL (Extract, Transform, Load) #Data Engineering #Data Analysis #Datasets #Spark (Apache Spark) #RDS (Amazon Relational Database Service) #Redshift #SQL (Structured Query Language) #Amazon EMR (Amazon Elastic MapReduce) #Data Warehouse #Jenkins #DevOps #Amazon Redshift #Security #Programming
Role description
Job Title: AWS Data Engineer
Location: Berkeley Heights, New Jersey, United States
Type: W2 Contract
Experience: 10+ Years
Job Summary:
We are seeking a skilled AWS Data Engineer with strong experience in building, optimizing, and maintaining scalable data pipelines on AWS. The ideal candidate will have expertise in cloud-based data solutions, ETL development, and big data technologies.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines using AWS services
• Build and optimize ETL/ELT processes for structured and unstructured data
• Work with large datasets in data lakes and data warehouses
• Implement data models and ensure data quality and integrity
• Collaborate with data analysts, data scientists, and stakeholders
• Monitor and troubleshoot data workflows and performance issues
• Implement security and compliance best practices in AWS environments
• Automate data processes using CI/CD pipelines
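The pipeline duties above follow the classic extract-transform-load pattern. As a minimal, locally runnable sketch (standard library only; in the actual role the source and target would be AWS services such as S3, Glue, or Redshift, and the field names below are purely illustrative):

```python
import csv
import io

# Hypothetical raw feed: in production this would land in Amazon S3;
# an inline CSV string stands in so the sketch runs anywhere.
RAW = """order_id,amount,region
1001,250.00,us-east
1002,,us-west
1003,99.50,us-east
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw feed into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: enforce a simple data-quality rule (non-empty amount)
    and cast types, mirroring the 'data quality and integrity' duty."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # a real pipeline would quarantine bad records
        clean.append({**row, "amount": float(row["amount"])})
    return clean

def load(rows: list[dict]) -> dict[str, float]:
    """Load: aggregate into a target table (here, totals per region)."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

if __name__ == "__main__":
    print(load(transform(extract(RAW))))  # {'us-east': 349.5}
```

On AWS the same three stages would typically map to a Glue or EMR Spark job, with CloudWatch covering the monitoring duty.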
Required Skills & Qualifications:
• 5+ years of experience in Data Engineering
• Strong hands-on experience with AWS services such as:
• Amazon S3
• AWS Glue
• Amazon Redshift
• Amazon EMR
• AWS Lambda
• Amazon RDS
• AWS Athena
• Experience with ETL tools and data pipeline development
• Strong SQL and Python/Scala programming skills
• Experience with big data technologies (Spark, Hadoop)
• Knowledge of data modeling and data warehousing concepts
• Experience with DevOps tools (CI/CD, Jenkins, Git)
• Good understanding of IAM roles, security, and networking in AWS
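The data-modeling and SQL requirements above usually mean dimensional (star-schema) design. A tiny self-contained sketch using Python's built-in sqlite3, with illustrative table and column names, showing a fact table joined to a dimension and aggregated:

```python
import sqlite3

# One dimension table and one fact table keyed to it: the core shape of a
# star schema in a data warehouse. Names here are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                             product_id INTEGER REFERENCES dim_product,
                             qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (10, 1, 3), (11, 1, 2), (12, 2, 7);
""")

# The typical analytical query: join fact to dimension, group, aggregate.
rows = con.execute("""
    SELECT p.name, SUM(f.qty) AS total_qty
    FROM fact_sales AS f
    JOIN dim_product AS p USING (product_id)
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 7), ('widget', 5)]
```

The same query shape carries over directly to Redshift or Athena, just at data-warehouse scale.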
Preferred Qualifications:
• AWS Certified Data Analytics or AWS Certified Solutions Architect
• Experience with Snowflake or Databricks
• Experience working in Agile environments