Enzo Tech Group

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Contract-to-Hire) focused on AWS, requiring strong hands-on experience building data pipelines, Terraform for infrastructure management, and proficiency in Python/SQL. The position emphasizes end-to-end ownership in a cloud-native environment.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
April 22, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Terraform #Cloud #IAM (Identity and Access Management) #S3 (Amazon Simple Storage Service) #AWS Lambda #Data Pipeline #Programming #Data Engineering #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Python #Lambda (AWS Lambda) #Automation #AWS (Amazon Web Services) #Data Processing #Scala
Role description
AWS Data Engineer (Contract-to-Hire) | Terraform | Lambda | End-to-End Ownership

We’re partnering with Navitus to hire a high-calibre AWS Data Engineer to help architect and scale a cloud-native data platform from the ground up. This role sits at the intersection of Data Engineering and Cloud Infrastructure, and is ideal for someone who has built production-grade data pipelines in AWS and understands how to provision, automate, and optimise the infrastructure those pipelines run on.

You’ll be joining a team focused on modernising their data ecosystem, moving away from legacy approaches and toward a fully automated, scalable, AWS-native environment. The expectation is impact from day one. This is a hands-on build role, not a support or maintenance position.

What You’ll Be Doing:
• Design, build, and optimise scalable data pipelines (ETL/ELT) in AWS
• Implement Infrastructure as Code (Terraform) to provision and manage cloud resources
• Develop serverless data processing solutions using AWS Lambda
• Work across services such as S3, Glue, Step Functions, IAM, and CloudWatch, or similar
• Ensure performance, scalability, and reliability of data workflows
• Take end-to-end ownership of both the data and infrastructure layers

Tech Environment:
• AWS (core platform)
• Terraform (Infrastructure as Code)
• AWS Lambda (serverless compute)
• Python / SQL (for data processing)
• Modern data tooling within a cloud-native architecture

What We’re Looking For:
• Strong hands-on experience with AWS in a data engineering context
• Proven experience building and maintaining data pipelines or ETL systems
• Experience with Terraform or similar IaC tools such as CloudFormation or CDK
• Solid programming skills in Python and/or SQL
• Understanding of cloud infrastructure, automation, and distributed systems
• Ability to operate independently and deliver with minimal oversight
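For a flavour of the serverless data-processing work described above, a Python Lambda transform step might look like the sketch below. Everything in it (the field names, the `transform_record` helper, the event shape) is a hypothetical illustration, not part of this role or Navitus's actual platform; a real S3-triggered handler would fetch objects via boto3, which is omitted here so the example runs without AWS credentials.

```python
import json


def transform_record(record: dict) -> dict:
    # Hypothetical transform: normalise field names and coerce types
    # before the record is loaded downstream.
    return {
        "id": str(record["id"]),
        "amount_usd": round(float(record["amount"]), 2),
    }


def lambda_handler(event, context):
    # In a real deployment an S3-triggered Lambda would read object keys
    # from event["Records"] and fetch the data with boto3; for this
    # self-contained sketch we assume the payload carries raw rows.
    rows = event.get("rows", [])
    transformed = [transform_record(r) for r in rows]
    return {"statusCode": 200, "body": json.dumps(transformed)}
```

In the stack this posting describes, a function like this would typically be provisioned with Terraform (an `aws_lambda_function` resource plus IAM role and S3 trigger) rather than created by hand, keeping the infrastructure layer in code alongside the pipeline logic.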