

Aptonet Inc
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (W2 Only) in New York, NY (Hybrid). Contract length and pay rate are not disclosed. It requires 3–6 years of experience and expertise in Snowflake, Apache Airflow, SQL, and cloud environments (AWS, Azure, or GCP).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 4, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, United States
-
🧠 - Skills detailed
#BI (Business Intelligence) #GCP (Google Cloud Platform) #Data Ingestion #Metadata #SQL (Structured Query Language) #Apache Airflow #Data Pipeline #ETL (Extract, Transform, Load) #Datasets #Data Governance #Data Management #AWS (Amazon Web Services) #Data Science #Data Warehouse #Documentation #Security #Cloud #Data Processing #Monitoring #Snowflake #Programming #Python #Batch #Data Modeling #Azure #Airflow #Scala #Data Analysis #Data Engineering
Role description
Job Title: Data Engineer (W2 Only)
📍 Location: New York, NY (Hybrid – 1 day per week On-site)
🕓 Interview Process: 3 Rounds (includes an in-person technical interview with the Product Manager, Nikhil)
💼 Employment Type: W2 Only
Position Overview
We are seeking a talented Data Engineer (W2 Only) to design, develop, and maintain scalable data pipelines and architectures that support analytics, BI, and data science initiatives. You’ll work closely with cross-functional teams—including product, analytics, and engineering—to build reliable, high-performance data solutions using Snowflake, Airflow, and CDC techniques.
Key Responsibilities
• Design, build, and maintain batch and near-real-time data ingestion pipelines leveraging Change Data Capture (CDC) from source systems (see the orchestration sketch after this list).
• Develop and manage orchestration workflows using Apache Airflow, including DAG creation, scheduling, and monitoring.
• Utilize Snowflake as the core data warehouse—design schemas, optimize performance, and ensure security and cost efficiency.
• Implement ETL/ELT transformations to deliver clean, trusted datasets for analytics and reporting.
• Partner with data analysts, data scientists, and business stakeholders to define requirements and deliver scalable solutions.
• Establish and maintain data governance, documentation, lineage, and metadata management best practices.
• Continuously monitor and optimize pipeline performance, reliability, and scalability.
• Stay updated with emerging tools, technologies, and frameworks to drive innovation and efficiency.
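To illustrate the CDC, Airflow, and Snowflake responsibilities above, here is a minimal sketch of an Airflow DAG that pulls a CDC batch from a source system and merges it into Snowflake. It assumes Airflow 2.x with the Snowflake provider installed; the DAG name, table and schema names, connection ID, and schedule are illustrative placeholders, not details from this posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator


def extract_cdc_batch(**context):
    """Pull the latest change records from the source system and land them
    in a staging table. Placeholder: the real logic depends on the CDC
    mechanism in use (log-based capture, change streams, vendor tooling)."""
    pass


default_args = {
    "owner": "data-engineering",
    "retries": 2,                      # basic reliability/monitoring hook
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="cdc_orders_to_snowflake",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@hourly",       # near-real-time micro-batches
    catchup=False,
    default_args=default_args,
) as dag:

    extract = PythonOperator(
        task_id="extract_cdc_batch",
        python_callable=extract_cdc_batch,
    )

    # Apply the staged changes to the warehouse table. Table, schema, and
    # connection names are illustrative only.
    merge_changes = SnowflakeOperator(
        task_id="merge_changes",
        snowflake_conn_id="snowflake_default",
        sql="""
            MERGE INTO analytics.orders AS tgt
            USING staging.orders_changes AS src
              ON tgt.order_id = src.order_id
            WHEN MATCHED AND src.op = 'D' THEN DELETE
            WHEN MATCHED THEN UPDATE SET
                tgt.status = src.status, tgt.updated_at = src.updated_at
            WHEN NOT MATCHED AND src.op <> 'D' THEN
                INSERT (order_id, status, updated_at)
                VALUES (src.order_id, src.status, src.updated_at);
        """,
    )

    extract >> merge_changes
```

Scheduling, retries, and task dependencies of this kind are what the DAG creation, scheduling, and monitoring responsibility above refers to.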
Required Qualifications
• 3–6 years of professional experience in Data Engineering or a related field.
• Proven hands-on experience with Snowflake (data modeling, performance tuning, and security).
• Experience developing and managing CDC pipelines for real-time or near-real-time data processing (a minimal Snowflake-based sketch follows this list).
• Strong experience with Apache Airflow or similar workflow orchestration tools.
• Advanced proficiency in SQL and at least one programming language (preferably Python).
• Strong understanding of data modeling, ETL/ELT processes, and data warehousing principles.
• Experience in cloud environments (AWS, Azure, or GCP).
• Excellent analytical, problem-solving, and cross-functional collaboration skills.
• Strong verbal and written communication abilities.
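For the CDC and Snowflake qualifications above, one common in-warehouse pattern uses Snowflake streams and tasks. The sketch below assumes the snowflake-connector-python package and an existing landing table; all object names, credentials, and the schedule are hypothetical placeholders.

```python
import snowflake.connector

# All object names and credentials below are placeholders for illustration.
statements = [
    # A stream records inserts/updates/deletes (the CDC delta) on the landing table.
    "CREATE STREAM IF NOT EXISTS raw.orders_stream ON TABLE raw.orders",
    # A task applies the captured changes to the modeled table on a schedule.
    """
    CREATE TASK IF NOT EXISTS raw.apply_orders_changes
      WAREHOUSE = transform_wh
      SCHEDULE  = '5 MINUTE'
    AS
      MERGE INTO analytics.orders AS tgt
      USING raw.orders_stream AS src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET tgt.status = src.status
      WHEN NOT MATCHED THEN INSERT (order_id, status)
        VALUES (src.order_id, src.status)
    """,
    "ALTER TASK raw.apply_orders_changes RESUME",
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***", role="SYSADMIN"
)
try:
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
finally:
    conn.close()
```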
Why Join Us
Join a forward-thinking team in New York that’s passionate about leveraging data to drive business outcomes. This role offers an opportunity to work with modern data technologies, contribute to architecture decisions, and make a tangible impact in a collaborative, growth-oriented environment.





