

Vibotek LLC
Senior Data Engineer / Data Architect (Remote and W2)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer/Data Architect (Remote, W2) with a contract length of "X months" and a pay rate of "$X/hour." Requires 10+ years in data engineering, expertise in healthcare data systems, and proficiency in cloud platforms and data engineering tools.
Country
United States
-
Currency
Unknown
-
Day rate
Unknown
-
Date
February 7, 2026
-
Duration
Unknown
-
Location
Remote
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
United States
-
Skills detailed
#S3 (Amazon Simple Storage Service) #Airflow #Microsoft Power BI #Qlik #dbt (data build tool) #Predictive Modeling #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #Compliance #NLP (Natural Language Processing) #Synapse #Azure Databricks #Redshift #Delta Lake #Lambda (AWS Lambda) #Python #ML (Machine Learning) #PySpark #Cloud #Azure #Scala #Data Engineering #AWS (Amazon Web Services) #Apache Airflow #Databricks #Azure Data Factory #ADF (Azure Data Factory) #Data Pipeline #Snowflake #AWS S3 (Amazon Simple Storage Service) #Spark (Apache Spark) #BI (Business Intelligence) #Computer Science #Data Integration #GDPR (General Data Protection Regulation) #Data Architecture #Data Governance #FHIR (Fast Healthcare Interoperability Resources)
Role description
Senior Data Engineer / Data Architect (Remote, W2)
Key Responsibilities:
Design & Architecture: Architect and design large-scale data systems, leveraging cloud technologies such as Azure Databricks, Snowflake, AWS, and Azure Synapse for healthcare data ecosystems.
ETL/ELT Pipelines: Build and optimize data pipelines using dbt, Apache Airflow, Azure Data Factory, and Python/PySpark to manage large volumes of healthcare data (a minimal orchestration sketch follows this list).
Healthcare Data Integration: Integrate data from systems such as Epic (Clarity, Caboodle, Tapestry), FHIR, HL7, and claims data for analysis and AI/ML models.
AI & Machine Learning: Implement AI-driven analytics, predictive modeling, and machine learning workflows to automate healthcare data processes and generate actionable insights.
Governance & Compliance: Implement and manage data governance frameworks ensuring HIPAA and GDPR compliance.
Collaboration & Mentorship: Collaborate with cross-functional teams and mentor junior engineers to ensure the success of data engineering initiatives.
Reporting & Business Intelligence: Develop dashboards using tools like Power BI, Qlik Sense, and DOMO, ensuring that data insights are actionable and meet business needs.
Optimization & Performance Tuning: Continuously monitor and optimize data systems for performance, reliability, and scalability.
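
To make the pipeline work above concrete, the following is a minimal sketch of the kind of orchestration involved: an Apache Airflow DAG that lands raw claims extracts and then runs dbt models. It is an illustrative assumption about how these tools are typically wired together, not Vibotek's actual pipeline; the DAG id, commands, and dbt project path are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="claims_elt_example",  # hypothetical pipeline name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Stage 1: copy raw claims extracts into the landing zone (placeholder command).
    land_raw = BashOperator(
        task_id="land_raw_claims",
        bash_command="echo 'copy raw claims files to the landing zone'",
    )

    # Stage 2: run dbt models that conform raw data into analytics-ready tables.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/claims",  # assumed project path
    )

    land_raw >> run_dbt

In a real deployment, the landing step would typically use an operator for the source system (for example an S3 copy or an Azure Data Factory trigger), with dbt running against Snowflake or Databricks as described above.
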
Key Skills & Experience:
Experience: 10+ years of experience in data engineering or data architecture, particularly in healthcare or other large-scale data systems.
Healthcare Systems: Strong knowledge of healthcare data systems, including Epic, FHIR, HL7, claims data, and Revenue Cycle Management (RCM).
Cloud Platforms: Expertise in Azure Databricks, Snowflake, AWS (S3, Glue, Lambda, Redshift), and Azure Synapse.
Data Engineering Tools: Hands-on experience with dbt, Apache Airflow, Azure Data Factory, Python, PySpark, and other data engineering tools.
Machine Learning: Experience with AI-driven analytics, predictive modeling, and NLP, including tools such as Azure AI Foundry.
BI & Reporting: Proficiency in Power BI, Qlik Sense, DOMO, and other BI/reporting tools.
Compliance: Knowledge of HIPAA, GDPR, and other healthcare data compliance standards.
Team Collaboration: Excellent communication skills and ability to work in a collaborative, remote environment.
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Preferred Qualifications:
Certifications in Epic Systems or related healthcare platforms.
Experience with Delta Lake, Unity Catalog, and Lakehouse Architecture (see the sketch after this list).
Expertise in AI, machine learning, and predictive analytics techniques.
Familiarity with large-scale healthcare data integration and governance frameworks.
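
As a small illustration of the Delta Lake and Lakehouse experience listed above, here is a minimal PySpark sketch that flattens a few fields from FHIR Patient exports into a Delta table. It assumes a Databricks-style environment where Delta Lake is available; the input path and table name are invented for illustration and are not part of this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fhir_patient_example").getOrCreate()

# FHIR bulk exports are commonly newline-delimited JSON, one resource per line.
patients = spark.read.json("/landing/fhir/Patient/*.ndjson")  # hypothetical landing path

# Keep a few top-level Patient fields; real models would also unnest names,
# identifiers, and extensions into conformed columns.
flattened = patients.select(
    F.col("id").alias("patient_id"),
    F.col("gender"),
    F.col("birthDate").alias("birth_date"),
)

# Persist as a Delta table so downstream dbt models and BI tools
# (Power BI, Qlik Sense) can query it.
flattened.write.format("delta").mode("overwrite").saveAsTable("bronze.patient")
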
