Senior Data Engineer - 100% Remote

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a contract-to-hire basis, offering 100% remote work. Requires 7+ years of experience in Azure Databricks and Data Factory, strong Python and SQL skills, and expertise in ETL processes within regulated industries.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 12, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Deployment #Azure Data Factory #Python #Data Lake #Automated Testing #Infrastructure as Code (IaC) #Compliance #Data Governance #Data Modeling #Programming #Data Warehouse #Computer Science #GDPR (General Data Protection Regulation) #Data Accuracy #Azure #Spark (Apache Spark) #Data Architecture #Continuous Deployment #SQL (Structured Query Language) #Azure Databricks #Apache Spark #Docker #Data Quality #Data Processing #Databricks #Debugging #Version Control #BI (Business Intelligence) #Datasets #Data Engineering #ADF (Azure Data Factory) #Data Science #ETL (Extract, Transform, Load) #Scala #Big Data #Data Pipeline #Security #DevOps #Cloud
Role description
Senior Data Engineer | 100% Remote | Contract to Hire

Role Overview
We are looking for a Senior Data Engineer to design, develop, and optimize scalable data pipelines and workflows within a cloud-based environment. This role focuses on transforming complex, high-volume data into reliable, actionable insights that support business intelligence, advanced analytics, and data science initiatives in highly regulated industries such as healthcare, diagnostics, or life sciences.

Key Responsibilities
• Design, develop, and maintain scalable and reliable data pipelines using Azure Databricks and Azure Data Factory (an illustrative sketch of this kind of pipeline follows the description below).
• Implement robust ETL/ELT processes to integrate heterogeneous data sources into centralized data warehouses or data lakes, ensuring data accuracy and availability.
• Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to translate data requirements into efficient engineering solutions.
• Optimize big data workflows for high performance and cost efficiency, handling large-scale structured and unstructured datasets.
• Monitor, troubleshoot, and ensure data quality, consistency, and compliance with regulatory requirements (e.g., HIPAA, GDPR) where applicable.
• Develop and enforce data engineering best practices, including version control, automated testing, CI/CD pipelines, and deployment.
• Mentor junior engineers and contribute to a data-driven culture and knowledge sharing.
• Stay current with emerging data engineering trends, cloud technologies, and industry standards.

Required Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related technical field.
• 7+ years of experience in data engineering with deep expertise in Azure Databricks and Azure Data Factory.
• Proven track record designing and implementing cloud-native big data architectures and ETL pipelines.
• Strong programming skills in Python, SQL, and Apache Spark for distributed data processing.
• Solid understanding of data modeling, data warehousing concepts, and modern data lake architectures.
• Knowledge of data governance, security, and compliance best practices relevant to regulated environments.
• Strong analytical, problem-solving, and debugging skills with complex data workflows.
• Excellent communication skills and the ability to work collaboratively in cross-functional teams.

Preferred Qualifications
• Experience implementing DevOps practices and tools for continuous integration and continuous deployment (CI/CD) in data engineering workflows.
• Familiarity with containerization (e.g., Docker) and infrastructure as code (IaC) tools.

Working Conditions
• No strenuous physical activity, though occasional light lifting of files and related materials is required.
• Approximately 30% of time is spent in meetings, working with the team, or on the phone; 70% is spent at a desk doing analytical work on a computer.
• Works for lengthy periods in a sitting or standing position and for lengthy periods using a computer.
• Minimal travel required, up to 10%; travel includes airplane and automobile travel and overnight hotel stays.
• At times, this role requires flexible work hours to meet deadlines, maintain system availability, and support business continuity.
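For candidates unfamiliar with the stack, the pipeline work described in the responsibilities above typically looks something like the following PySpark sketch: a Databricks job reads data landed by an Azure Data Factory pipeline, applies cleansing and quality checks, and writes a curated Delta table. This is a minimal illustrative example only; all paths, column names, and table names are hypothetical placeholders, not details from this posting, and it assumes a Databricks runtime with Delta Lake available.

```python
# Illustrative sketch only: paths, columns, and names are hypothetical,
# not taken from the posting. Assumes a Databricks runtime with PySpark
# and Delta Lake available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lab_results_etl").getOrCreate()

# Hypothetical landing-zone path populated by an Azure Data Factory copy activity.
raw_path = "abfss://landing@examplestorage.dfs.core.windows.net/lab_results/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/lab_results/"

# Extract: read the raw files as delivered.
raw = spark.read.format("parquet").load(raw_path)

# Transform: basic cleansing and standardization of the kind listed in the
# responsibilities (deduplication, typing, simple data-quality gates).
cleaned = (
    raw.dropDuplicates(["result_id"])                       # drop reprocessed records
       .withColumn("result_value", F.col("result_value").cast("double"))
       .withColumn("collected_date", F.to_date("collected_ts"))
       .filter(F.col("result_value").isNotNull())           # simple quality filter
)

# Load: write a curated Delta table partitioned for downstream BI and analytics.
(
    cleaned.write.format("delta")
           .mode("overwrite")
           .partitionBy("collected_date")
           .save(curated_path)
)
```

In practice, a job like this would be triggered and parameterized from an Azure Data Factory pipeline or Databricks Workflows, with version control and CI/CD handling deployment, as reflected in the qualifications above.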