NextGen | GTA: A Kelly Telecom Company

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract basis, focusing on AWS, Databricks, and Snowflake. Required skills include building data pipelines, ETL optimization, and Terraform. Experience in risk mitigation and technical documentation is preferred. The day rate is $576, based in Philadelphia, PA.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
576
-
🗓️ - Date
March 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Philadelphia, PA
-
🧠 - Skills detailed
#Documentation #Data Pipeline #Snowflake #Infrastructure as Code (IaC) #Databricks #Quality Assurance #Terraform #Strategy #Data Engineering #GitHub #AWS (Amazon Web Services) #ETL (Extract, Transform, Load)
Role description
This is a contract position focused on building and scaling world-class data infrastructure using AWS, Databricks, and Snowflake. You will be at the forefront of our data movement strategy, ensuring that information flows seamlessly and reliably across our enterprise ecosystems.

The Opportunity
This role is perfect for a builder who thrives on solving complex architectural puzzles and eliminating system vulnerabilities. You will have the unique opportunity to lead the charge in identifying and mitigating single points of failure (SPOF) within a high-growth environment. Beyond the technical challenges, we prioritize a collaborative culture that encourages knowledge sharing and professional development, ensuring you stay at the cutting edge of data engineering best practices.

Required Skills & Experience
• Proven experience building and maintaining data pipelines in AWS.
• Expertise in developing and optimizing ETL jobs within Databricks.
• Hands-on experience with Snowflake data warehousing.
• Proficiency in Infrastructure as Code using Terraform.
• Experience managing CI/CD pipelines via Concourse or GitHub Actions.

Desired Skills & Experience
• Strong background in technical documentation and creating architectural specifications.
• Experience in risk mitigation and high-availability system design.
• Background in Quality Assurance and post-delivery support for data products.

What You Will Be Doing
You will be responsible for the end-to-end development of data pipelines, from requirement gathering to post-production support. You will proactively identify system risks, troubleshoot defects with high urgency, and ensure the absolute reliability of our data delivery.
Tech Breakdown
• 40% Databricks & ETL
• 30% AWS & Snowflake
• 20% Terraform & CI/CD
• 10% Documentation & Testing

Daily Responsibilities
• 70% Hands-On Engineering (Coding, Pipeline Building, Troubleshooting)
• 20% Team Collaboration & Knowledge Sharing
• 10% Technical Specs & Risk Mitigation Strategy

Posted By: Nicole Screnci