

New York Technology Partners
Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer, 5 days onsite in Atlanta, with a pay rate of "TBD." Requires strong proficiency in Python and SQL, experience with Snowflake, dbt, and AWS Airflow, and the ability to work independently in a client-facing setting.
Country
United States
-
Currency
$ USD
-
Day rate
480
-
Date
April 30, 2026
-
Duration
Unknown
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Atlanta, GA
-
Skills detailed
#Python #Data Quality #Data Pipeline #AI (Artificial Intelligence) #AWS (Amazon Web Services) #Snowflake #Data Engineering #Automation #Airflow #Data Integration #Cloud #dbt (data build tool) #ETL (Extract, Transform, Load) #ML (Machine Learning) #Scala #SQL (Structured Query Language)
Role description
THIS ROLE IS 5 DAYS ONSITE IN ATLANTA. US CITIZEN OR GC HOLDER ONLY!

Project Overview
This role supports a key data engineering workstream focused on building and maintaining enterprise-grade data pipelines in a modern cloud environment. The ideal candidate can ramp quickly, work independently, and operate effectively in a client-facing setting.
Key Responsibilities
• Design, build, and maintain scalable data pipelines and transformations
• Develop and optimize data models within a cloud data ecosystem
• Collaborate with cross-functional teams to support data integration and analytics needs
• Ensure data quality, reliability, and performance across systems
• Work directly with stakeholders in a client-facing capacity, translating requirements into technical solutions
• Contribute to best practices around data engineering, automation, and workflow orchestration
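The data-quality responsibility above can be sketched as a simple row-level gate that rejects bad records before they reach downstream tables. This is purely illustrative: the column names (`order_id`, `amount`) and the rules are hypothetical, not the client's actual schema.

```python
# Minimal data-quality gate: split incoming rows into valid and
# rejected sets based on simple rules, before loading downstream.
# Column names and rules are hypothetical examples.

def validate_rows(rows):
    """Return (valid, rejected) lists; rejected items carry a reason."""
    valid, rejected = [], []
    for row in rows:
        if row.get("order_id") is None:       # key column must be present
            rejected.append((row, "missing order_id"))
        elif row.get("amount", 0) < 0:        # amounts must be non-negative
            rejected.append((row, "negative amount"))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": None, "amount": 5.00},
    {"order_id": 2, "amount": -3.50},
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # 1 2
```

In practice, checks like these would typically live as dbt tests or as a validation task in the orchestration layer rather than standalone Python.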
Required Skills & Experience
• Strong proficiency in Python and SQL (must-have)
• Hands-on experience with:
  • Snowflake
  • dbt (data build tool)
  • AWS Airflow (or similar orchestration tools)
• Solid understanding of data pipeline architecture and ETL/ELT processes
• Experience working in cloud-based environments
• Ability to work independently and manage priorities in a fast-paced setting
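The ETL/ELT pattern the requirements describe can be sketched end-to-end in a few lines. This is a hedged illustration, not the client's implementation: `sqlite3` stands in for a warehouse like Snowflake, the SQL transform plays the role a dbt model would, and all table and column names are made up.

```python
import sqlite3

# ELT sketch: load raw events, then transform in-warehouse with SQL.
# sqlite3 substitutes for the warehouse; in the real stack the
# transform would be a dbt model and the flow an Airflow DAG.

def run_pipeline(raw_events):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO raw_events VALUES (?, ?)", raw_events)

    # Transform: aggregate raw events into a reporting table.
    con.execute("""
        CREATE TABLE user_totals AS
        SELECT user_id, SUM(amount) AS total_spend
        FROM raw_events
        GROUP BY user_id
    """)
    return con.execute(
        "SELECT user_id, total_spend FROM user_totals ORDER BY user_id"
    ).fetchall()

print(run_pipeline([(1, 10.0), (1, 5.0), (2, 7.5)]))  # [(1, 15.0), (2, 7.5)]
```

The design point the stack implies: load first, transform inside the warehouse (ELT), so transformations stay in version-controlled SQL rather than bespoke extraction code.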
Preferred / Nice-to-Have
• Exposure to AI/ML tools or workflows (not required)
  • Examples: AWS Bedrock, Snowflake ML
• Familiarity with automation and AI-assisted data processes
• Experience in pilot or early-stage AI initiatives is a plus






