Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a Senior Data Engineer contract position lasting more than 6 months and paying $50.00 - $65.00 per hour. Key skills include Snowflake, Azure Data Factory, SQL, and data integration. The role is fully remote, and 3+ years of relevant experience is preferred.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
$520
πŸ—“οΈ - Date discovered
September 12, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Remote
🧠 - Skills detailed
#Data Accuracy #ETL (Extract, Transform, Load) #Data Pipeline #Scripting #Compliance #DevOps #Logging #Databricks #Monitoring #SQL (Structured Query Language) #Data Lake #Business Analysis #Version Control #Azure #Computer Science #Data Security #Azure Data Factory #Data Modeling #Python #Data Integration #Scala #Vault #Security #Synapse #Metadata #Data Science #Datasets #Deployment #JSON (JavaScript Object Notation) #Git #Data Governance #Microsoft Power BI #Snowflake #Data Engineering #Cloud #Kafka (Apache Kafka) #Azure SQL #BI (Business Intelligence) #ML (Machine Learning) #ADF (Azure Data Factory)
Role description
Job Title: Data Engineer – Snowflake & Azure Data Factory

Overview:
We are seeking a skilled Data Engineer with expertise in Snowflake and Azure Data Factory (ADF) to design, build, and optimize modern data pipelines and cloud-based data solutions. The ideal candidate will have hands-on experience with large-scale datasets, data integration, ETL/ELT processes, and cloud technologies, enabling reliable, scalable, and secure data flows across the organization.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Azure Data Factory to ingest, transform, and load data from multiple sources (see the first sketch after this description).
- Build and optimize data models and warehouses in Snowflake to support analytics, reporting, and data science needs.
- Implement ETL/ELT processes ensuring high-quality, reliable, and performant data integration.
- Collaborate with business analysts, data scientists, and other stakeholders to understand requirements and deliver data solutions.
- Monitor and optimize pipeline performance, troubleshoot failures, and ensure data accuracy (see the row-count check sketch below).
- Apply best practices in data security, governance, and compliance.
- Work with structured, semi-structured, and unstructured data such as JSON, Parquet, and Avro (see the VARIANT query sketch below).
- Automate workflows, version control, and CI/CD deployment of data solutions.
- Document technical designs, data flow diagrams, and metadata for ongoing support.

Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
- 3+ years of professional experience as a Data Engineer or in a similar role.
- Strong expertise in Snowflake (data modeling, performance tuning, query optimization).
- Hands-on experience with Azure Data Factory (pipeline creation, orchestration, scheduling, monitoring).
- Proficiency in SQL and scripting languages (Python, Scala, or similar).
- Experience with the Azure ecosystem (Azure SQL DB, Data Lake, Synapse, Key Vault, etc.).
- Knowledge of data integration best practices, error handling, and logging frameworks.
- Familiarity with DevOps practices, Git, and CI/CD for data pipelines.
- Understanding of data governance, security, and compliance standards.

Preferred Skills:
- Experience with Databricks, Azure Synapse, or Power BI.
- Knowledge of streaming technologies (Kafka, Event Hubs, etc.).
- Exposure to machine learning pipelines and data science workflows.

Why Join Us?
- Work on cutting-edge cloud data platforms.
- Collaborate with a highly skilled data & analytics team.
- Opportunity to shape and scale enterprise-level data engineering solutions.

Job Types: Full-time, Contract
Pay: $50.00 - $65.00 per hour
Benefits: 401(k), dental insurance, health insurance, paid time off, vision insurance
Work Location: Remote
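For a sense of the ADF orchestration work described above, here is a minimal sketch of triggering and monitoring a pipeline run from Python using the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are hypothetical placeholders, not details of this role.

```python
# Minimal sketch: trigger an ADF pipeline run and poll it to completion.
# All resource names below are hypothetical placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-data-platform"                       # placeholder
FACTORY_NAME = "adf-ingest"                               # placeholder
PIPELINE_NAME = "pl_daily_sales_load"                     # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline, optionally passing runtime parameters.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2025-09-12"},  # hypothetical pipeline parameter
)

# Poll until the run leaves its in-flight states.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline run {run.run_id} finished with status: {status}")
```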
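Likewise, the semi-structured data responsibility typically means querying JSON stored in a Snowflake VARIANT column. Below is a minimal sketch using snowflake-connector-python; the connection values, the raw_events table, and the payload column are all hypothetical.

```python
# Minimal sketch: query JSON in a Snowflake VARIANT column.
# Connection values and table/column names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # placeholder
    user="etl_user",            # placeholder
    password="***",             # placeholder; prefer key-pair auth or a secrets vault
    warehouse="WH_XS",          # placeholder
    database="ANALYTICS",       # placeholder
    schema="RAW",               # placeholder
)

try:
    cur = conn.cursor()
    # Snowflake's colon/dot path syntax drills into VARIANT data
    # without flattening it first.
    cur.execute(
        """
        SELECT payload:customer.id::string AS customer_id,
               payload:amount::number(10, 2) AS amount
        FROM raw_events
        WHERE payload:event_type::string = 'purchase'
        LIMIT 10
        """
    )
    for customer_id, amount in cur.fetchall():
        print(customer_id, amount)
finally:
    conn.close()
```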
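Finally, a minimal sketch of the kind of post-load data-accuracy check the monitoring responsibility implies: reconcile source and target row counts and fail on drift beyond a tolerance. The two count functions are stubs standing in for real queries against the source system and Snowflake.

```python
# Minimal sketch: post-load row-count reconciliation with a drift tolerance.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("reconciliation")

def get_source_count() -> int:
    # Stub: in practice, query the source system (e.g. Azure SQL DB).
    return 1_000_000

def get_target_count() -> int:
    # Stub: in practice, query the Snowflake target table.
    return 999_950

def reconcile(tolerance: float = 0.001) -> None:
    src, tgt = get_source_count(), get_target_count()
    drift = abs(src - tgt) / max(src, 1)
    log.info("source=%d target=%d drift=%.4f", src, tgt, drift)
    if drift > tolerance:
        # Surface the failure so the orchestrator marks the run as failed.
        raise ValueError(f"Row-count drift {drift:.4%} exceeds tolerance {tolerance:.2%}")

if __name__ == "__main__":
    reconcile()
```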