

New York Technology Partners
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract, onsite in Sandy Springs, Georgia. Key skills include Snowflake, Azure Data Factory, SQL, and data pipeline development. Experience with Azure services and data warehousing concepts is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
March 21, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta Metropolitan Area
-
🧠 - Skills detailed
#Deployment #Datasets #Azure ADLS (Azure Data Lake Storage) #GitHub #Security #ADF (Azure Data Factory) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Version Control #Data Quality #Data Science #Data Analysis #Scala #Snowflake #Azure DevOps #Azure #Automated Testing #Data Pipeline #ADLS (Azure Data Lake Storage) #AI (Artificial Intelligence) #Data Engineering #SnowPipe #Monitoring #Programming #Python #Logging #Azure Security #Data Lake #Azure Machine Learning #ML (Machine Learning) #Vault #Scripting #Databricks #Synapse #Microsoft Azure #Azure Data Factory #Cloud #Observability #Clustering #Databases #Storage #DevOps
Role description
We are partnered with a digital consultancy looking for a Senior Data Engineer to support its leading restaurant-chain clients. The position starts as a 6-month contract with the potential to extend or convert to full-time. Candidates must be comfortable working onsite four days per week at the client site in Sandy Springs, Georgia. Sponsorship is not available at this time.
Job Description
The Data Engineer will design, build, and optimize cloud data solutions using Snowflake on Microsoft Azure. The role focuses on scalable data pipelines, robust data models, and secure, performant analytics capabilities for business stakeholders.
Key Responsibilities
• Design, develop, and maintain ELT/ETL data pipelines using Azure Data Factory (ADF) and related Azure services.
• Build and manage Snowflake databases, schemas, tables, views, streams, tasks, and stored procedures.
• Integrate data from diverse sources into Azure Data Lake Storage and Snowflake, handling both structured and semi-structured data.
• Implement and monitor data quality checks, error handling, and observability for pipelines across Azure and Snowflake.
• Optimize Snowflake warehouses and Azure resources for performance and cost (query tuning, warehouse sizing, partitioning/clustering, scheduling).
• Apply Azure security and governance best practices, including RBAC, Azure Key Vault, and network security configuration, alongside Snowflake roles and access controls.
• Collaborate with data analysts, data scientists, and business teams to translate requirements into end-to-end solutions on Azure and Snowflake.
• Use CI/CD (e.g., Azure DevOps/GitHub Actions) for version control, automated testing, and deployment of data engineering assets.
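To make the data-quality and error-handling responsibility above concrete, here is a minimal, framework-free Python sketch of the kind of batch checks a pipeline might run before loading into Snowflake. The function and field names are illustrative assumptions, not from any specific tool:

```python
# Minimal sketch of row-level data-quality checks a pipeline might run
# before loading a batch into a warehouse. All names are illustrative.

def run_quality_checks(rows, required_fields):
    """Return a list of (row_index, problem) tuples for rows that fail checks."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-empty.
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        # Uniqueness: the batch's primary key must not repeat.
        key = row.get("id")
        if key in seen_ids:
            issues.append((i, f"duplicate id {key}"))
        seen_ids.add(key)
    return issues

if __name__ == "__main__":
    batch = [
        {"id": 1, "store": "ATL-01", "sales": 1200.50},
        {"id": 2, "store": "", "sales": 980.00},       # missing store
        {"id": 1, "store": "ATL-03", "sales": 450.25},  # duplicate id
    ]
    print(run_quality_checks(batch, required_fields=["store", "sales"]))
```

In a real ADF + Snowflake pipeline, checks like these would typically surface through the pipeline's monitoring/alerting rather than a simple return value, but the logic is the same.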
Required Skills & Experience
• Strong hands-on experience with Snowflake (advanced SQL, performance tuning, Snowpipe/streams/tasks, stored procedures).
• Practical experience building production data pipelines on Azure with ADF; Synapse, Databricks, or Azure Functions is a plus.
• Proficiency in SQL and at least one scripting/programming language (e.g., Python or PowerShell).
• Solid understanding of data warehousing concepts, dimensional modeling, and ETL/ELT best practices.
• Experience working with Azure Data Lake Storage (ADLS) and integrating it with Snowflake.
• Familiarity with monitoring and logging using Azure tools (e.g., Azure Monitor, Log Analytics) and Snowflake query/performance history.
• Experience in building data pipelines for AI/ML use cases (feature stores, model training datasets, inference data flows).
• Familiarity with Azure AI/ML services such as Azure Machine Learning, Cognitive Services, or Azure OpenAI.
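As a small illustration of the semi-structured data handling the skills list calls for, the sketch below flattens a nested JSON order document into one flat row per line item, analogous in spirit to Snowflake's LATERAL FLATTEN over a VARIANT column. The document shape and field names are hypothetical:

```python
import json

def flatten_orders(payload):
    """Explode a nested JSON order document into one flat dict per line item.
    Mirrors the effect of Snowflake's LATERAL FLATTEN on a VARIANT column.
    The document shape here is a hypothetical example."""
    doc = json.loads(payload)
    rows = []
    for item in doc.get("items", []):
        rows.append({
            "order_id": doc["order_id"],
            "store": doc["store"],
            "sku": item["sku"],
            "qty": item["qty"],
        })
    return rows

if __name__ == "__main__":
    payload = json.dumps({
        "order_id": 101,
        "store": "ATL-01",
        "items": [
            {"sku": "BURGER", "qty": 2},
            {"sku": "FRIES", "qty": 1},
        ],
    })
    print(flatten_orders(payload))
```

In practice this transformation would more often be expressed in Snowflake SQL or an ADF mapping data flow, but the row-explosion logic is the same.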






