Vaco by Highspring

Data Engineer (ETL Developer) (Hybrid)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (ETL Developer) on a 9-month contract, offering $55-$65/hr. It requires expertise in ETL development, SQL Server, and healthcare data integration, particularly with payer data and HEDIS measures. The role is hybrid, based in Tampa, FL.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
520
🗓️ - Date
October 1, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Tampa, FL
🧠 - Skills detailed
#Azure SQL #Synapse #GIT #Normalization #SQL (Structured Query Language) #Data Quality #Databricks #Data Lake #Data Modeling #Computer Science #Python #ADF (Azure Data Factory) #Data Science #Monitoring #Visualization #Scala #Security #Automation #Microsoft Power BI #Azure Data Factory #SQL Server #SSIS (SQL Server Integration Services) #Compliance #BI (Business Intelligence) #Datasets #Azure #Data Engineering #ETL (Extract, Transform, Load) #Azure SQL Database #Scripting #Data Pipeline #Data Governance #Data Privacy #EDW (Enterprise Data Warehouse) #Version Control #Data Warehouse #Indexing
Role description
Position Summary
We are seeking a skilled Data Engineer (ETL Developer) to modernize and optimize our data pipelines. The Data Engineer (ETL Developer) will be responsible for designing, building, and maintaining scalable ETL pipelines to support our analytics and reporting needs. This individual will leverage Azure Data Factory (ADF), SQL Server Integration Services (SSIS), SQL Server, and modern BI platforms (Domo and Power BI) to ensure accurate, timely, and secure data delivery. The ideal candidate will have experience with healthcare data, including payer raw data, NCQA HEDIS quality data, and Medicare Risk Adjustment datasets.
• Cannot work C2C or provide sponsorship
• Duration: 9-month contract
• Compensation: $55-$65/hr
Key Responsibilities
• Design, develop, and maintain modernized ETL pipelines using Azure Data Factory, Fabric, Databricks, and/or SSIS to integrate multiple data sources into the enterprise data warehouse.
• Collaborate with analysts, data scientists, and business stakeholders to deliver clean, reliable, and well-structured data.
• Optimize and refine existing ETL processes to improve performance, scalability, and maintainability.
• Develop, maintain, and/or convert SQL stored procedures, views, functions, and scripts for analytics and reporting.
• Implement data quality checks, error handling, and monitoring to ensure accurate and consistent data flows.
• Use a code repository (Git or similar) to manage, version, and document ETL code.
• Support integration of healthcare data sources, including claims, eligibility, provider rosters, and quality/risk adjustment data.
• Partner with business teams to enable HEDIS measure calculation, quality reporting, and risk adjustment analytics.
• Collaborate with the BI team to deliver data models and datasets for dashboards in Power BI and Domo.
• Ensure adherence to data governance, compliance, and security best practices (HIPAA, PHI/PII handling).
• Troubleshoot data issues, perform root cause analysis, and implement preventive measures.
Required Qualifications
• Bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent experience.
• 5+ years of professional experience in ETL development and SQL-based data warehouse engineering.
• Strong expertise with SQL Server Integration Services (SSIS) and T-SQL.
• Proven experience integrating and transforming healthcare payer data (claims, eligibility, encounters).
• Proficiency with Git or other version control systems for code management.
• Experience supporting data visualization/reporting in Power BI and Domo.
• Strong understanding of data modeling, normalization, indexing, and query optimization.
• Strong problem-solving skills and the ability to communicate technical concepts to non-technical stakeholders.
• Familiarity with HIPAA compliance, data privacy, and PHI/PII handling.
Preferred Skills
• Experience with Azure Data Factory or Fabric (or similar) and other Azure services (Azure SQL Database, Data Lake, Synapse).
• Working knowledge of NCQA HEDIS quality measures and related data sets.
• Experience with Medicare Risk Adjustment (HCC coding, RAF scores) data structures and reporting.
• Background in healthcare analytics, value-based care, or health plan operations.
• Familiarity with APIs, scripting (Python/PowerShell), and automation for data pipeline orchestration.