Enzo Tech Group

Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect on a contract of 3 to 6 months; the pay rate is not specified. It requires strong AWS experience, advanced Python skills, and ETL pipeline expertise. Start-up experience is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 14, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Java #ADF (Azure Data Factory) #Azure Data Factory #SQL (Structured Query Language) #AWS (Amazon Web Services) #Azure #Data Engineering #Data Architecture #Data Pipeline #Cloud #Databricks #Python #Scala #Spark (Apache Spark) #ETL (Extract, Transform, Load)
Role description
We’re hiring a senior AWS Data Engineer to step into a high-impact transformation role inside a large-scale enterprise environment operating at start-up pace. This position is 80% hands-on Data Engineering and 20% Data Architecture: ideal for someone who wants to be on the keyboard driving production delivery now, with a clear pathway into architectural ownership.

Important: We are specifically looking for individuals who have worked in start-up or high-growth environments: people who are comfortable with ambiguity, fast decision-making, and building while stabilizing.

What You’ll Be Doing:

80% Data Engineering (Hands-On Delivery)
• Build and deploy robust Python-based ETL pipelines on AWS
• Stabilize and improve reliability across a live enterprise data platform
• Deliver production-ready code during a major AWS restructuring initiative
• Step in during critical delivery moments and solve real production issues

20% Data Architecture (Strategic Influence)
• Shape long-term platform direction and design decisions
• Provide technical guidance to engineering teams (no line management)
• Transition into greater architectural ownership within 3–6 months

Tech Stack:

Core Requirements:
• Strong experience in AWS cloud environments
• Advanced Python for data engineering
• Proven background building and maintaining ETL / data pipelines

Nice to Have:
• Cloud data architecture experience and strong SQL
• Spark experience highly valued
• Exposure to Azure Data Factory, GCP, Databricks, Scala, or Java

Who This Suits:
• Engineers who have worked in start-up or transformation environments
• Individuals comfortable owning delivery without layers of bureaucracy
• Technically strong professionals who want to evolve into architecture
• Builders who prefer solving problems over sitting in slide decks
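For candidates gauging what "Python-based ETL pipelines" means in practice here, the shape of the work can be sketched as a minimal extract-transform-load flow. This is an illustrative, dependency-free example only, not the client's actual stack: the sample data, function names, and the in-memory SQLite target (standing in for an AWS warehouse such as Redshift or S3/Athena) are all hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw source extract; in the real role this would come
# from an AWS source (S3 object, RDS table, API feed, etc.).
RAW_CSV = """order_id,amount,currency
1001,250.00,USD
1002,99.50,USD
1003,,USD
"""


def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))


def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop rows with missing amounts and cast types."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["currency"])
        for r in rows
        if r["amount"]  # skip records with no amount
    ]


def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: insert cleaned records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    conn.commit()
    return len(records)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    loaded = load(transform(extract(RAW_CSV)), conn)
    print(f"Loaded {loaded} rows")  # the row with a missing amount is dropped
```

In production this pattern is typically wrapped in orchestration (Step Functions, Airflow/MWAA, or Glue jobs) with retries and monitoring, which is where the "stabilize and improve reliability" part of the role comes in.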