

Sr Data Engineer USC & GC Only (Max $65/hr on W2)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer (USC & GC only) with a contract length of "unknown" and a pay rate of up to $65/hr on W2. Key skills include advanced AWS data engineering, complex SQL development, Python for ETL, and serverless ETL pipeline design.
Country
United States
Currency
$ USD
Day rate
$520
Date discovered
August 14, 2025
Project duration
Unknown
Location type
Unknown
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Reston, VA
Skills detailed
#Lambda (AWS Lambda) #AWS (Amazon Web Services) #Automation #Data Extraction #Data Engineering #Redshift #Athena #ETL (Extract, Transform, Load) #Python #DevOps #React #Tableau #Data Ingestion #Microservices #DynamoDB #SageMaker #Looker #ML (Machine Learning) #SNS (Simple Notification Service) #Cloud #SQS (Simple Queue Service) #S3 (Amazon Simple Storage Service) #SQL Queries #BI (Business Intelligence) #AWS Lambda #SQL (Structured Query Language)
Role description
Job Description
Profile: Senior Data Engineer with strong data engineering and AWS-native skills.
Top 5 Key Skills
• Advanced AWS Data Engineering (Redshift, S3, Lambda, DynamoDB, Glue, Athena, Step Functions, SNS/SQS)
• Complex SQL Query Development & Optimization
• Python for ETL and Automation
• Serverless ETL Pipeline Design (especially AWS Lambda + Redshift integration; see the sketch after this list)
• BI Tools & Real-time Analytics (e.g., QuickSight, Tableau, Looker)
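To make the Lambda + Redshift item above concrete, here is a minimal, hypothetical sketch of a serverless ingestion function: an S3 event triggers a Lambda that issues a COPY into Redshift Serverless through the Redshift Data API (boto3). The workgroup, database, table, and IAM role names are illustrative placeholders, not details from this posting.

```python
# Hypothetical Lambda handler: on an S3 PutObject event, load the new object
# into Redshift Serverless via the Redshift Data API.
import boto3

redshift_data = boto3.client("redshift-data")

WORKGROUP = "analytics-wg"       # assumed Redshift Serverless workgroup
DATABASE = "analytics"           # assumed database name
TARGET_TABLE = "staging.events"  # assumed target table
COPY_ROLE_ARN = "arn:aws:iam::123456789012:role/redshift-copy-role"  # placeholder


def handler(event, context):
    """Submit a COPY statement for each newly arrived S3 object."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        copy_sql = (
            f"COPY {TARGET_TABLE} "
            f"FROM 's3://{bucket}/{key}' "
            f"IAM_ROLE '{COPY_ROLE_ARN}' "
            "FORMAT AS JSON 'auto';"
        )
        # ExecuteStatement is asynchronous; production code would poll
        # DescribeStatement (or use EventBridge) to confirm completion.
        resp = redshift_data.execute_statement(
            WorkgroupName=WORKGROUP,
            Database=DATABASE,
            Sql=copy_sql,
        )
        print(f"Submitted COPY for s3://{bucket}/{key}: {resp['Id']}")
```

In a real pipeline this handler would typically sit behind an S3 event notification or an SQS queue, with Step Functions or Glue handling heavier transformations before the load.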
You have:
Ideally 6+ years of software development experience in a production environment
Expertise writing and optimizing complex SQL queries for data extraction, transformation, and reporting (e.g., SELECT, JOIN, GROUP BY, HAVING, window functions; see the query sketch after this list)
Expertise integrating Redshift Serverless with AWS services such as S3, Glue, Lambda, Athena, DynamoDB, and Kinesis, as well as BI tools like Tableau, QuickSight, or Looker, for real-time analytics and dashboards
Expertise building serverless ETL pipelines with Lambda to automate data ingestion and transformation into Redshift
Experience with Python required; React and Node.js nice to have
Experience building event-driven cloud-based serverless applications and deploying to AWS
Technical knowledge of Cloud Computing, DevOps, and Microservices
Excellent communication skills with the ability to present ideas in business-friendly and user-friendly language
Demonstrated ability to develop and maintain good customer working relationships
Exceptional analytical, conceptual, and problem-solving abilities
Proficient in designing, building, and deploying machine learning models using Amazon SageMaker (a deployment sketch follows this list)
Able to prioritize and execute tasks in a high-pressure environment
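As a hedged illustration of the SQL-and-Redshift items above, the snippet below submits a windowed reporting query to Redshift Serverless through the Redshift Data API. The table and columns (sales, region, order_ts, amount) and the workgroup/database names are invented for the example.

```python
# Hypothetical reporting query using window functions, run against
# Redshift Serverless with the Redshift Data API (boto3).
import boto3

QUERY = """
SELECT
    region,
    order_ts,
    amount,
    SUM(amount) OVER (PARTITION BY region ORDER BY order_ts
                      ROWS UNBOUNDED PRECEDING) AS running_region_total,
    RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS amount_rank
FROM sales
WHERE order_ts >= DATEADD(day, -30, GETDATE())
"""

client = boto3.client("redshift-data")
resp = client.execute_statement(
    WorkgroupName="analytics-wg",  # assumed workgroup name
    Database="analytics",          # assumed database name
    Sql=QUERY,
)
# Results become available via get_statement_result once
# describe_statement reports the query as FINISHED.
print(resp["Id"])
```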
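For the SageMaker item above, here is a separate minimal sketch using the SageMaker Python SDK, assuming a scikit-learn training script (train.py), an execution role, and training data already staged in S3. All names and paths are placeholders, not details from this posting.

```python
# Hedged sketch: train a scikit-learn model on SageMaker and deploy it
# behind a real-time endpoint.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/sagemaker-execution-role"  # placeholder

estimator = SKLearn(
    entry_point="train.py",      # assumed training script
    role=role,
    instance_type="ml.m5.large",
    framework_version="1.2-1",
    sagemaker_session=session,
)
estimator.fit({"train": "s3://example-bucket/training-data/"})  # placeholder path

# Deploy the trained model to a managed HTTPS inference endpoint.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
)
print(predictor.endpoint_name)
```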
Additional Information
All your information will be kept confidential according to EEO guidelines.