

Empiric
AWS Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Engineer, remote for 3-6 months at $100/hr. Key skills include real-time event pipelines, AWS (Redshift, Lambda, S3), advanced Python or Java, Terraform, and CI/CD expertise with GitLab.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
800
-
🗓️ - Date
January 10, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Infrastructure as Code (IaC) #S3 (Amazon Simple Storage Service) #Prometheus #Kafka (Apache Kafka) #Data Ingestion #Cloud #Observability #Python #Data Pipeline #Redshift #Terraform #Java #API (Application Programming Interface) #Monitoring #Lambda (AWS Lambda) #Data Engineering #Data Integration #Deployment #GitLab #Automated Testing #AWS (Amazon Web Services) #Scala #Grafana
Role description
Job Title: Data Engineer
Location: US-remote
Contract Length: 3-6 months (rolling)
Hourly Rate: $100/hr
Requirements:
• Engineering real-time event pipelines and platform data integrations.
• Kafka and Event-Driven Design: Designing consumers for high-volume event streams (Auth/Settlement) and managing schema evolution (a minimal Python sketch follows this list).
• AWS Cloud Data Engineering: Building scalable data services using Redshift, Lambda, and S3 within a modern cloud architecture.
• Advanced Python or Java: Developing production-grade code for data ingestion, API integrations, and complex business logic.
• Infrastructure as Code: Experience using Terraform to manage AWS data resources and environments.
• CI/CD for Data: Expertise in GitLab CI/CD for automated testing and deployment of data pipelines.
• Monitoring and Observability: Setting up Prometheus, Grafana, or CloudWatch for real-time pipeline health tracking.
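To illustrate the kind of work involved (this is not part of the job spec), below is a minimal sketch of a Kafka consumer that reads settlement events and lands micro-batches in S3 for downstream Redshift loading, assuming the confluent_kafka and boto3 libraries. The topic name, consumer group, bucket, and newline-delimited JSON layout are illustrative placeholders; a production pipeline would add schema-registry handling, retries, and dead-lettering.

```python
import json
import time
import uuid

import boto3
from confluent_kafka import Consumer, KafkaError

# All names below (topic, bucket, group id) are illustrative placeholders.
TOPIC = "settlement-events"
BUCKET = "example-event-landing-zone"

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "settlement-s3-lander",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,  # commit only after a batch is safely in S3
})
consumer.subscribe([TOPIC])

s3 = boto3.client("s3")

def flush_batch(records):
    """Write one micro-batch as newline-delimited JSON to S3."""
    if not records:
        return
    key = f"settlement/{time.strftime('%Y/%m/%d')}/{uuid.uuid4()}.jsonl"
    body = "\n".join(json.dumps(r) for r in records).encode("utf-8")
    s3.put_object(Bucket=BUCKET, Key=key, Body=body)

batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            if msg.error().code() == KafkaError._PARTITION_EOF:
                continue
            raise RuntimeError(msg.error())
        batch.append(json.loads(msg.value()))
        if len(batch) >= 500:
            flush_batch(batch)
            consumer.commit(asynchronous=False)  # offsets advance only after the S3 write
            batch = []
finally:
    flush_batch(batch)
    consumer.close()
```

With manual offset commits this gives at-least-once delivery, so any downstream Redshift COPY from the landed files should be idempotent (for example, deduplicating on an event id).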






