New York Technology Partners

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with an unspecified contract length, offering a day rate of $560 USD, and is remote. Key skills include Snowflake, Azure Data Factory, SQL, and Python, with a focus on building scalable data pipelines and optimizing cloud environments.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
March 26, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta Metropolitan Area
-
🧠 - Skills detailed
#Data Science #Databases #Security #GitHub #SQL (Structured Query Language) #ADLS (Azure Data Lake Storage) #Azure #Vault #Data Quality #Data Engineering #Scala #Data Pipeline #Microsoft Azure #Monitoring #Cloud #Data Lake #Azure Machine Learning #Logging #Storage #ADF (Azure Data Factory) #Azure Security #Synapse #ETL (Extract, Transform, Load) #Azure ADLS (Azure Data Lake Storage) #AI (Artificial Intelligence) #Azure Data Factory #Data Analysis #Databricks #ML (Machine Learning) #Python #Azure DevOps #DevOps #Snowflake
Role description
About the Role

Our client is seeking Senior Data Engineers to help design and build modern cloud-based data solutions. This role focuses on developing scalable data pipelines, optimizing Snowflake environments, and delivering high-quality analytics capabilities for business stakeholders. This is a highly visible project supporting enterprise-level data initiatives across a large, well-known brand portfolio.

Key Responsibilities
• Design, develop, and maintain ELT/ETL pipelines using Azure Data Factory and related Azure services
• Build and manage Snowflake databases, schemas, tables, views, streams, tasks, and stored procedures
• Integrate structured and semi-structured data from multiple sources into Azure Data Lake and Snowflake
• Implement data quality checks, monitoring, and error handling across pipelines
• Optimize Snowflake warehouses and Azure resources for performance and cost efficiency
• Apply Azure security and governance best practices including RBAC, Key Vault, and networking controls
• Collaborate with data analysts, data scientists, and business stakeholders to deliver end-to-end solutions
• Implement CI/CD pipelines using Azure DevOps or GitHub Actions

Required Experience
• Strong hands-on Snowflake experience (advanced SQL, performance tuning, streams, tasks, stored procedures)
• Experience building production data pipelines on Microsoft Azure
• Proficiency in SQL and Python
• Experience with Azure Data Lake Storage (ADLS)
• Strong understanding of data warehousing concepts and dimensional modeling
• Experience with monitoring and logging using Azure tools and Snowflake performance tracking

Nice to Have
• Experience with Azure Synapse, Databricks, or Azure Functions
• Experience supporting AI/ML data pipelines
• Familiarity with Azure Machine Learning, Azure OpenAI, or Cognitive Services