

Mastech Digital
Cloud Engineer (W2 Candidates Only)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Cloud Engineer (AWS Data Engineer) in Seattle, WA, for 12+ months on a W2 contract. It requires 5+ years of experience with AWS (Redshift, S3), Python, SQL, ETL, and data modeling; experience integrating SAP with external applications is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 23, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Seattle, WA
-
🧠 - Skills detailed
#Amazon EMR (Amazon Elastic MapReduce) #ETL (Extract, Transform, Load) #AWS Glue #Big Data #Data Pipeline #AWS S3 (Amazon Simple Storage Service) #Data Modeling #Data Engineering #Scrum #SAP #Cloud #Python #Web Services #Lambda (AWS Lambda) #Oracle #Redshift #Scala #SQL (Structured Query Language) #Data Lake #AWS (Amazon Web Services) #EC2
Role description
Job Title: AWS Data Engineer
Location: Seattle, WA
Duration: 12+ Months
📢 Note: No C2C / No C2H — Only W2 Candidates
Job Description:
We are seeking an experienced AWS Data Engineer to join our team in Seattle. The ideal candidate will have strong experience designing and implementing integrations between SAP and external applications (such as Oracle or homegrown systems) using AWS S3 and Redshift. This position requires close collaboration with business and IT stakeholders to gather requirements, apply industry best practices, and deliver high-quality, scalable data solutions.
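The integration pattern described above (SAP or Oracle extracts landed in S3 and loaded into Redshift) typically comes down to staging a flat file and issuing a Redshift COPY. The Python sketch below is illustrative only and is not part of the posting: the bucket, table, cluster endpoint, and IAM role names are hypothetical placeholders, and a real pipeline would add scheduling, error handling, and secrets management.

    """Illustrative S3-to-Redshift load. All names (bucket, table, IAM role,
    cluster endpoint) are hypothetical placeholders, not values from this posting."""
    import boto3
    import psycopg2

    S3_BUCKET = "example-sap-extracts"                     # hypothetical bucket
    S3_KEY = "sap/material_master/extract.csv"             # hypothetical key
    REDSHIFT_DSN = dict(
        host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",  # hypothetical
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="REPLACE_ME",  # in practice: Secrets Manager or IAM auth
    )
    IAM_ROLE_ARN = "arn:aws:iam::123456789012:role/example-redshift-copy-role"

    def stage_extract(local_path: str) -> str:
        """Upload a flat-file extract (e.g. from SAP or Oracle) to S3."""
        boto3.client("s3").upload_file(local_path, S3_BUCKET, S3_KEY)
        return f"s3://{S3_BUCKET}/{S3_KEY}"

    def copy_into_redshift(s3_uri: str) -> None:
        """Load the staged file into a Redshift staging table via COPY."""
        copy_sql = f"""
            COPY staging.material_master
            FROM '{s3_uri}'
            IAM_ROLE '{IAM_ROLE_ARN}'
            FORMAT AS CSV
            IGNOREHEADER 1;
        """
        with psycopg2.connect(**REDSHIFT_DSN) as conn:
            with conn.cursor() as cur:
                cur.execute(copy_sql)
            conn.commit()

    if __name__ == "__main__":
        copy_into_redshift(stage_extract("material_master.csv"))

The same load could equally be driven by an AWS Glue job or a Lambda function; the staging-then-COPY shape is what matters, not the specific runner.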
Required Skills & Experience:
• 5+ years of experience working with business teams to translate requirements into technical solutions.
• 5+ years of hands-on experience with advanced Python, SQL, data modeling, ETL development, and data warehousing.
• 5+ years of direct experience with AWS technologies, including:
• AWS Stack: Redshift and S3 (mandatory), EC2 (preferred)
• Big Data Platforms: Amazon EMR, Data Lakes
• Data Pipelines: AWS Glue, Serverless, Lambda, Step Functions (see the sketch after these lists)
• Languages: SQL / Python
• Proven ability to conduct design workshops, gather requirements, configure systems, and perform testing.
• Strong analytical and problem-solving skills with excellent communication and teamwork abilities.
• Experience participating in design discussions, scrum calls, and providing regular status updates.
Nice-to-Have Skills:
• In-depth knowledge of the Amazon Web Services (AWS) ecosystem.
• Additional experience with Python and broader AWS services (Redshift, S3, EC2).
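As a purely illustrative companion to the "Data Pipelines" item above, here is a minimal sketch of a Lambda handler that starts an AWS Glue job when a new extract object lands in S3. The Glue job name and job arguments are hypothetical placeholders, not details from this posting; in practice such a trigger would usually sit behind EventBridge or a Step Functions state machine with retries and alerting.

    """Hypothetical Lambda handler: start a Glue ETL job when an extract
    lands in S3. The Glue job name and argument keys are placeholders."""
    import boto3

    glue = boto3.client("glue")

    GLUE_JOB_NAME = "example-sap-to-redshift-etl"  # hypothetical Glue job name

    def handler(event, context):
        """Invoked by an S3 'ObjectCreated' notification; starts one
        Glue job run per newly created object in the event."""
        run_ids = []
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            response = glue.start_job_run(
                JobName=GLUE_JOB_NAME,
                Arguments={
                    "--source_path": f"s3://{bucket}/{key}",  # placeholder job argument
                },
            )
            run_ids.append(response["JobRunId"])
        return {"started_job_runs": run_ids}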