

Enzo Tech Group
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer (AWS) position for a 6-month contract-to-hire, offering flexible pay and remote work. Key skills include AWS data services, Python, ETL development, and data governance experience; Collibra and Terraform familiarity are a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Engineering #Terraform #Scala #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Data Governance #Data Quality #Python #Collibra #Data Pipeline
Role description
Data Engineer (AWS)
Company: Healthcare
Location: Remote / Flexible
Engagement: 6-Month Contract-to-Hire
Rate: Flexible
Overview
A healthcare organization is seeking a hands-on Data Engineer to join a growing AWS-based data team. This role focuses on building scalable data pipelines while driving improvements in data governance and data quality across the platform.
Key Responsibilities
• Build and maintain ETL/data pipelines using Python on AWS
• Own and improve production data workflows
• Support and enhance data governance and quality frameworks
• Collaborate with engineering and architecture teams
• Help define and implement data best practices
Required Skills
• Strong experience with AWS data services
• Proficiency in Python
• Experience with ETL / pipeline development
• Exposure to data governance and data quality
• Experience with Collibra (nice to have)
• Familiarity with Terraform (IaC)
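To illustrate the kind of day-to-day work the responsibilities above describe, here is a minimal sketch of a transform step with a data-quality gate, written in plain Python. The field names (`patient_id`, `Visit_Date`) and the quality rule are hypothetical examples, not taken from the role description; a production pipeline on AWS would typically read and write these records via S3 or Glue.

```python
import csv
import io

def transform(rows):
    """Apply a simple data-quality gate and normalize field names.

    Drops records missing a patient_id (hypothetical required field)
    and lowercases/strips column names so downstream consumers see
    a consistent schema.
    """
    out = []
    for row in rows:
        # Data-quality rule: require a non-empty patient_id
        if not row.get("patient_id"):
            continue
        out.append({k.strip().lower(): v.strip() for k, v in row.items()})
    return out

# Example input with one clean record and one failing the quality gate
raw = "patient_id, Visit_Date \n123,2026-01-02\n,2026-01-03\n"
rows = list(csv.DictReader(io.StringIO(raw)))
clean = transform(rows)
```

In an interview or on the job, a check like this would usually live behind a governance framework (e.g. rules cataloged in Collibra) rather than hard-coded, but the shape of the logic is the same.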
