

Aptonet Inc
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in New York (hybrid, 1 day per week on-site); the contract length and pay rate are unspecified. It requires 3–6 years of experience and expertise in Snowflake, Apache Airflow, and CDC (Change Data Capture) techniques.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 4, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
New York, United States
🧠 - Skills detailed
#BI (Business Intelligence) #GCP (Google Cloud Platform) #Data Ingestion #Schema Design #SQL (Structured Query Language) #Apache Airflow #Data Pipeline #Clustering #ETL (Extract, Transform, Load) #Datasets #Data Governance #AWS (Amazon Web Services) #Data Science #Data Warehouse #Documentation #Security #Cloud #Data Quality #Monitoring #Snowflake #Scala #Programming #Python #Batch #Data Modeling #Azure #Airflow #Data Vault #Data Engineering
Role description
Job Title: Data Engineer
Location: New York (hybrid – 1 day per week on-site)
Interview Process: 3 rounds – the technical interview will be in person
About the Role
We are seeking a highly skilled Data Engineer to design, build, and maintain reliable data pipelines and architectures that empower analytics, BI, and data science teams. The ideal candidate will be hands-on with Snowflake, Airflow, and modern CDC (Change Data Capture) techniques to ensure data is accurate, timely, and delivered at scale.
This role will require close collaboration across teams — including product, analytics, and engineering — to translate complex business needs into robust technical solutions.
Key Responsibilities
• Design, develop, and operate batch and near-real-time data ingestion pipelines leveraging CDC from multiple source systems.
• Build and manage orchestration workflows using Apache Airflow, including DAG design, dependencies, recoverability, and alerting (a minimal DAG sketch follows this list).
• Utilize Snowflake as the core data warehouse platform: schema design, query optimization, partitioning, clustering, cost monitoring, and access control.
• Develop and maintain ETL/ELT processes to deliver clean, well-structured datasets for downstream consumption.
• Design and implement scalable data models (dimensional, data vault, etc.) aligned with business domains.
• Monitor and improve data platform performance, reliability, and cost efficiency.
• Collaborate with cross-functional stakeholders to understand requirements, support analytics initiatives, and provide high-quality data solutions.
• Implement best practices for data governance, documentation, lineage, and data quality.
• Stay up to date with emerging data engineering tools and technologies to continuously improve architecture and processes.
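As an illustration of the Airflow orchestration work described above, here is a minimal DAG sketch for a nightly CDC ingestion flow, assuming Airflow 2.4+. The DAG id, task callables, and alert address are hypothetical placeholders, not details from this posting.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_changes():
    # Placeholder: pull changed rows from the source system since the last run.
    pass


def merge_into_warehouse():
    # Placeholder: apply the captured changes to the warehouse tables.
    pass


default_args = {
    "owner": "data-engineering",
    "retries": 2,                          # recoverability: retry transient failures
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,              # alerting (assumes SMTP is configured)
    "email": ["data-alerts@example.com"],  # hypothetical alert address
}

with DAG(
    dag_id="cdc_ingestion_nightly",        # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                     # Airflow 2.4+ scheduling argument
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract_changes", python_callable=extract_changes)
    load = PythonOperator(task_id="merge_into_warehouse", python_callable=merge_into_warehouse)

    extract >> load                        # explicit task dependency: extract before load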
Required Qualifications
• 3–6 years of professional experience in data engineering or a closely related field.
• Proven hands-on experience with Snowflake, including data warehouse design, performance tuning, and security.
• Strong experience implementing Change Data Capture (CDC) pipelines (see the merge sketch after this list).
• Proficiency with Apache Airflow (or similar orchestration frameworks).
• Advanced SQL skills and proficiency in Python or another programming language.
• Solid understanding of data modeling, ETL/ELT patterns, and data warehousing best practices.
• Experience working in cloud environments (AWS, Azure, or GCP).
• Strong problem-solving skills, attention to detail, and a collaborative mindset.
• Excellent communication skills with the ability to translate business needs into technical solutions.
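To illustrate the CDC and Snowflake experience called for above, here is a minimal sketch that applies captured changes with a Snowflake MERGE statement via snowflake-connector-python. All identifiers (tables, columns, connection parameters) are hypothetical, and the staging table is assumed to carry an op flag ('I'/'U'/'D') written by the CDC tool.

import os

import snowflake.connector

# Hypothetical target and staging tables; src.op marks insert/update/delete rows.
MERGE_SQL = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.customer_changes AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.op = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET
    tgt.name = src.name,
    tgt.email = src.email,
    tgt.updated_at = src.changed_at
WHEN NOT MATCHED AND src.op != 'D' THEN INSERT
    (customer_id, name, email, updated_at)
    VALUES (src.customer_id, src.name, src.email, src.changed_at)
"""

conn = snowflake.connector.connect(
    account="my_account",                       # hypothetical account identifier
    user="etl_user",                            # hypothetical service user
    password=os.environ["SNOWFLAKE_PASSWORD"],  # never hard-code credentials
    warehouse="TRANSFORM_WH",                   # hypothetical virtual warehouse
    database="PROD",                            # hypothetical database
)
try:
    conn.cursor().execute(MERGE_SQL)            # apply all captured changes in one pass
finally:
    conn.close()

In practice the credentials would come from a secrets manager and the MERGE would run as a task inside the Airflow DAG sketched earlier.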