Merit Consulting Group Inc
AWS Data Engineer - W2s Only - GC / USC
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer; the contract length and pay rate are unspecified. Key skills include AWS services, Python, and data pipeline development. Candidates must be US citizens (USC) or Green Card (GC) holders and have hands-on experience with Amazon Redshift.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 8, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Washington, United States
-
🧠 - Skills detailed
#Compliance #Datasets #Migration #AWS (Amazon Web Services) #Terraform #ETL (Extract, Transform, Load) #Data Modeling #Data Security #RDS (Amazon Relational Database Service) #AWS Lambda #Python #Infrastructure as Code (IaC) #Automation #Consulting #Cloud #Amazon Redshift #Data Pipeline #Redshift #VPC (Virtual Private Cloud) #Scala #Data Engineering #Athena #AWS Glue #Deployment #EC2 #S3 (Amazon Simple Storage Service) #Monitoring #Lambda (AWS Lambda) #Security #Data Integration #Data Processing
Role description
NOTE: USC / GC Only
Job Summary:
We are seeking a skilled AWS Data Engineer to support the design, development, and maintenance of scalable cloud-based data solutions. The ideal candidate will have strong experience with AWS services, data engineering tools, and automation to support the Deloitte project team.
Key Responsibilities:
• Design, build, and maintain scalable data pipelines using AWS services
• Develop and optimize data workflows using Python, AWS Glue, and Amazon Redshift
• Work with large datasets using Amazon Athena for querying and analysis
• Implement serverless solutions using AWS Lambda
• Build, configure, and manage AWS infrastructure components such as EC2, S3, RDS, and VPCs
• Automate deployment, monitoring, and data processing workflows using AWS native tools
• Troubleshoot and resolve issues related to AWS environments and data integrations
• Ensure data security, compliance, and governance standards are met
• Support cloud migration initiatives for applications and data
• Collaborate with cross-functional teams to deliver high-quality cloud solutions
Required Skills:
• Strong experience with AWS services (Redshift, Glue, Athena, Lambda, S3, EC2, RDS, VPC)
• Proficiency in Python for data processing and automation
• Hands-on experience with Amazon Redshift (Must Have)
• Experience building ETL/ELT pipelines in AWS
• Knowledge of data modeling and data warehousing concepts
• Familiarity with cloud security best practices
Preferred Qualifications:
• Experience working in large enterprise environments (preferably with consulting teams)
• AWS certifications (e.g., AWS Certified Data Analytics or Solutions Architect)
• Experience with CI/CD pipelines and infrastructure as code (CloudFormation/Terraform)