Enzo Tech Group

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of 3 to 6 months, offering a competitive pay rate. Key skills include advanced Python, AWS data ecosystems, and ETL design. Experience in data architecture and big data tools is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
April 14, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Architecture #Data Pipeline #Leadership #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Engineering #Cloud #Databricks #Big Data #GCP (Google Cloud Platform) #Strategy #Spark (Apache Spark) #Scala #Java #AWS (Amazon Web Services) #Python #Data Strategy #Azure
Role description
We're looking for a Senior Data Engineer / Emerging Architect who can operate at both deep technical and strategic levels: someone who's comfortable owning production delivery today while shaping platform architecture tomorrow. This is a high-impact transformation role within an enterprise environment operating at start-up speed.

Role Overview:
• Initial Phase (0–6 months): ~70–80% hands-on engineering
• Transition Phase: Gradual shift toward architecture, design ownership, and platform strategy
• End State: Majority focus on data architecture and technical leadership

What You'll Do:
• Lead development of production-grade ETL pipelines (Python on AWS)
• Stabilize and scale a live enterprise data platform under transformation
• Act as a senior technical authority, guiding engineering standards and delivery
• Drive architecture decisions, platform evolution, and long-term design
• Stay close to the code while influencing enterprise-level data strategy

What We're Looking For

Core Expertise:
• Deep experience with AWS data ecosystems
• Advanced Python engineering (production-level pipelines)
• Strong background in ETL / data pipeline design and optimisation

Architecture & Bonus Skills:
• Experience in data architecture, distributed systems, and SQL
• Exposure to Spark, Databricks, or similar big data tools
• Multi-cloud familiarity (Azure, GCP) or JVM languages (Scala/Java)

Why This Role Stands Out
• Own Critical Transformation: Step into a platform that needs senior ownership now
• Clear Architecture Pathway: Defined progression into a Data Architect role within 3–6 months
• Hands-On + Strategic: Not a "slideware" role; real engineering impact from day one
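To give candidates a concrete sense of the "production-grade ETL pipelines (Python on AWS)" work described above, here is a minimal, hypothetical sketch of an extract/transform/load flow. It is not the company's actual codebase: the extract and load stages are stubbed with in-memory data (a real pipeline would typically use boto3 against S3 or a warehouse loader), and all field names are illustrative.

```python
# Hypothetical ETL sketch: parse newline-delimited JSON, normalize, materialize.
# Extract/load are stubbed in memory; a production version would read from and
# write to AWS services (e.g. S3 via boto3), with logging and retries.
import json
from typing import Iterable, Iterator


def extract(raw_lines: Iterable[str]) -> Iterator[dict]:
    """Parse newline-delimited JSON records, skipping malformed lines."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production: log and route to a dead-letter queue


def transform(records: Iterable[dict]) -> Iterator[dict]:
    """Keep only records with a primary key; coerce amounts to 2 decimals."""
    for rec in records:
        if "id" not in rec:
            continue
        yield {"id": rec["id"], "amount": round(float(rec.get("amount", 0)), 2)}


def load(records: Iterable[dict]) -> list[dict]:
    """Materialize the batch; a real loader would write to S3 or a warehouse."""
    return list(records)


raw = ['{"id": 1, "amount": "19.994"}', "not json", '{"amount": 5}']
result = load(transform(extract(raw)))
print(result)  # [{'id': 1, 'amount': 19.99}]
```

The generator-based stages keep memory usage flat on large batches, which is the usual shape for this kind of pipeline code.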