

Empiric
AWS Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer, offering a 3-6 month contract at $75-80 per hour. Key skills include real-time event pipelines, AWS services (Redshift, Lambda, S3), advanced Python or Java, Terraform, and CI/CD expertise with GitLab.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
March 18, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Monitoring #AWS (Amazon Web Services) #Redshift #Scala #Infrastructure as Code (IaC) #GitLab #Data Ingestion #S3 (Amazon Simple Storage Service) #Cloud #Data Integration #Prometheus #Terraform #API (Application Programming Interface) #Python #Grafana #Kafka (Apache Kafka) #Automated Testing #Java #Lambda (AWS Lambda) #Data Engineering #Observability #Deployment #Data Pipeline
Role description
Job Title: AWS Data Engineer
Location: US-remote
Contract Length: 3-6 months (rolling)
Hourly Rate: $75-80 per hour
Requirements:
• Kafka and Event-Driven Design: Engineering real-time event pipelines and platform data integrations; designing consumers for high-volume event streams (Auth/Settlement) and managing schema evolution.
• AWS Cloud Data Engineering: Building scalable data services using Redshift, Lambda, and S3 within a modern cloud architecture.
• Advanced Python or Java: Developing production-grade code for data ingestion, API integrations, and complex business logic.
• Infrastructure as Code: Experience using Terraform to manage AWS data resources and environments.
• CI/CD for Data: Expertise in GitLab CI/CD for automated testing and deployment of data pipelines.
• Monitoring and Observability: Setting up Prometheus, Grafana, or CloudWatch for real-time pipeline health tracking.
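The schema-evolution requirement above can be illustrated with a minimal Python sketch: a consumer-side event parser that tolerates fields added or removed across schema versions. All field names and defaults here are hypothetical examples, not taken from the listing.

```python
import json

# Hypothetical defaults for optional fields introduced in later
# schema versions (illustrative names, not from the job posting).
DEFAULTS = {"currency": "USD", "status": "pending"}

def parse_settlement_event(raw: bytes) -> dict:
    """Parse one settlement event while tolerating schema evolution:
    required fields are validated, missing optional fields get defaults,
    and unknown new fields are passed through untouched."""
    event = json.loads(raw)
    # Fail fast if a producer breaks backward compatibility.
    for field in ("event_id", "amount"):
        if field not in event:
            raise ValueError(f"missing required field: {field}")
    # Event values override defaults; extra fields survive the merge.
    return {**DEFAULTS, **event}

# A v1 event produced before "status" existed still parses cleanly.
old_event = b'{"event_id": "e1", "amount": 125.50}'
print(parse_settlement_event(old_event))
```

In a real pipeline the same tolerant-reader pattern would sit inside a Kafka consumer loop, but the parsing logic is independent of the transport.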
