

S3 Shared Service Solutions, LLC
DHCF Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DHCF Senior Data Engineer on a contract basis, hybrid remote in Washington, DC. Pay is $100.00 - $101.00 per hour. Requires 6-10 years of experience, expertise in SSIS, Azure Databricks, and Medicaid data structures, plus relevant certifications.
Country: United States
Currency: $ USD
Day rate: 808
Date: February 3, 2026
Duration: Unknown
Location: Hybrid
Contract: Unknown
Security: Unknown
Location detailed: Washington, DC 20001
Skills detailed: #Synapse #Security #Version Control #Deployment #Data Warehouse #Azure DevOps #Azure cloud #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Compliance #Azure #Databricks #SSIS (SQL Server Integration Services) #Data Engineering #ADF (Azure Data Factory) #GIT #Migration #Data Governance #SQL (Structured Query Language) #SSAS (SQL Server Analysis Services) #Delta Lake #Data Quality #SQL Server #PySpark #SSRS (SQL Server Reporting Services) #Azure Databricks #Azure Data Factory #Cloud #DevOps #CMS (Centers for Medicare & Medicaid Services) #AI (Artificial Intelligence)
Role description
DHCF is looking for a Senior Data Engineer to drive the data modernization of its Medicaid data ecosystem.
1. Position Purpose
The Senior Data Engineer serves as the primary technical engine for the agency's Medicaid data ecosystem. This role is unique: it requires mastery of our current legacy environment, characterized by SSIS ETL processes managed via Team Foundation Server (TFS), while actively spearheading the execution of our cloud modernization roadmap. Under the direction of the Lead Data Warehouse Solution Architect, you will ensure the stability of current Medicaid reporting while building the future-state Azure Synapse and Databricks Lakehouse.
2. Key Responsibilities
A. Legacy Maintenance & Operational Excellence (Current State)
ETL Management: Maintain, troubleshoot, and modify complex SSIS packages handling high-volume Medicaid claims, provider, and member data.
Version Control: Manage code deployments and branching strategies within TFS, ensuring continuous integration of legacy SQL assets.
Legacy Reporting Support: Support and optimize SSRS report queries and SSAS tabular/multidimensional models to ensure federal and state compliance reporting remains uninterrupted.
B. Modernization & Migration Execution (Future State)
Cloud Development: Implement "Medallion Architecture" (Bronze/Silver/Gold) using Azure Databricks (PySpark/SQL) as designed by the Lead Architect; a sketch follows this list.
Pipeline Refactoring: Lead the transition of legacy SSIS logic into Azure Data Factory (ADF) and Databricks notebooks.
DevOps Transformation: Facilitate the migration of source control and CI/CD pipelines from TFS to Azure DevOps (Git).
Synapse Integration: Build and tune Dedicated and Serverless SQL Pools within Azure Synapse to facilitate advanced analytics and AI-readiness.
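For orientation, here is a minimal PySpark sketch of the Bronze/Silver/Gold flow named above, as it might look in a Databricks notebook. The source path, schema names, and columns (claim_id, member_id, paid_amount, and so on) are hypothetical placeholders, not DHCF's actual structures, and the bronze/silver/gold schemas are assumed to already exist.
```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # predefined in a Databricks notebook

# Bronze: land raw claim extracts as-is, stamped with load metadata.
# Path and column names are hypothetical placeholders.
bronze = (
    spark.read.option("header", True).csv("/mnt/raw/claims/")
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.claims")

# Silver: enforce types, deduplicate, and drop rows that fail basic checks.
silver = (
    spark.table("bronze.claims")
    .dropDuplicates(["claim_id"])
    .withColumn("paid_amount", F.col("paid_amount").cast("decimal(12,2)"))
    .filter(F.col("member_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.claims")

# Gold: a reporting-ready aggregate for Synapse and downstream analytics.
gold = (
    spark.table("silver.claims")
    .groupBy("provider_id", F.date_trunc("month", "service_date").alias("service_month"))
    .agg(
        F.sum("paid_amount").alias("total_paid"),
        F.count("claim_id").alias("claim_count"),
    )
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.provider_monthly_spend")
```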
C. Data Governance & Security
Medicaid Compliance: Implement Row-Level Security (RLS) and automated data masking for PHI/PII in accordance with HIPAA, CMS MARS-E, and NIST standards (see the first sketch after this list).
Data Quality: Develop automated data validation frameworks to ensure data parity between legacy SQL systems and the new Cloud Lakehouse (see the second sketch after this list).
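As a concrete illustration of the RLS and masking item, here is a minimal sketch using Unity Catalog row filters and column masks, run from a Databricks notebook where spark is predefined. The group names, function names, tables, and columns are hypothetical; on the legacy SQL Server side, the equivalent controls would be CREATE SECURITY POLICY and dynamic data masking.
```python
# Row filter: analysts only see claims for the programs their group covers.
# Group, schema, and column names below are hypothetical placeholders.
spark.sql("""
    CREATE OR REPLACE FUNCTION gov.security.program_filter(program_code STRING)
    RETURN is_account_group_member('dhcf_admins')
        OR (is_account_group_member('medicaid_analysts') AND program_code = 'MEDICAID')
""")
spark.sql(
    "ALTER TABLE silver.claims SET ROW FILTER gov.security.program_filter ON (program_code)"
)

# Column mask: redact member SSNs for everyone outside the eligibility team.
spark.sql("""
    CREATE OR REPLACE FUNCTION gov.security.ssn_mask(ssn STRING)
    RETURN CASE WHEN is_account_group_member('eligibility_team') THEN ssn
                ELSE concat('***-**-', right(ssn, 4)) END
""")
spark.sql("ALTER TABLE silver.members ALTER COLUMN ssn SET MASK gov.security.ssn_mask")
```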
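And for the data-parity item, a minimal validation sketch that compares the legacy warehouse against the Lakehouse over JDBC. The connection details, secret scope, and column list are hypothetical placeholders; dbutils is available only inside Databricks.
```python
from pyspark.sql import functions as F

# Hypothetical JDBC connection to the legacy SQL Server warehouse.
legacy = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://legacy-dw.example.gov;databaseName=MedicaidDW")
    .option("dbtable", "dbo.FactClaims")
    .option("user", "svc_parity")
    .option("password", dbutils.secrets.get("parity-scope", "sql-password"))
    .load()
)
lakehouse = spark.table("silver.claims")

# Check 1: row counts must agree exactly.
legacy_count, lake_count = legacy.count(), lakehouse.count()
assert legacy_count == lake_count, f"row count mismatch: {legacy_count} vs {lake_count}"

# Check 2: per-column checksums catch silent value drift that counts miss.
def column_checksum(df, col):
    # Sum of per-row hashes: order-independent, so row ordering differences
    # between the two systems do not matter.
    return df.select(F.sum(F.hash(F.col(col).cast("string"))).alias("h")).first()["h"]

for col in ["claim_id", "member_id", "paid_amount"]:
    assert column_checksum(legacy, col) == column_checksum(lakehouse, col), f"drift in {col}"
```
Casting each column to string before hashing normalizes type differences between the JDBC source and Delta, so the checksums compare values rather than storage representations.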
3. Competencies for Success
Technical Agility: The ability to pivot between a 10-year-old SSIS package and a modern Databricks Spark job in the same day.
Collaboration: Ability to take high-level architectural blueprints from the Lead Architect and translate them into high-performance, production-ready code.
Attention to Detail: Absolute precision in Medicaid data handling, where an error in logic can impact member benefits or federal funding.
------------------------
Job Type: Contract
Pay: $100.00 - $101.00 per hour
Application Question(s):
What is your current location?
Experience maintaining SQL Server (SSIS/SSAS/SSRS) while concurrently deploying Azure cloud solutions.
Expert-level proficiency in SSIS and T-SQL. Advanced proficiency in Azure Databricks (Unity Catalog, Delta Lake) and Azure Synapse.
Deep experience with TFS (Team Foundation Server) and a strong desire to migrate workflows to Git/Azure DevOps.
6-10 years leading advanced technology or service projects.
Extensive experience with Medicaid/Medicare data structures (e.g., MMIS, EDI 837/835, claims processing).
Microsoft Certified: Azure Data Engineer Associate or Databricks Certified Professional Data Engineer.
Work Location: Hybrid remote in Washington, DC 20001