Drillo.AI
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer specializing in GCP, requiring 8–10 years of experience, particularly in healthcare data systems. The contract runs longer than 6 months and pays up to $120,000 per year. Key skills include Python, SQL, and HIPAA compliance.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
545
🗓️ - Date
November 13, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Governance #Cloud #Version Control #Agile #Metadata #Compliance #ML (Machine Learning) #BigQuery #Leadership #Data Warehouse #Dataflow #GIT #Data Engineering #Data Privacy #Looker #PySpark #Data Modeling #Scala #Data Quality #Storage #Apache Beam #Data Integration #Data Transformations #Data Lake #Scrum #Jenkins #Airflow #Data Management #GCP (Google Cloud Platform) #Microsoft Power BI #SQL (Structured Query Language) #AWS (Amazon Web Services) #Azure #DevOps #JSON (JavaScript Object Notation) #Data Accuracy #Data Analysis #ETL (Extract, Transform, Load) #Terraform #Python #Datasets #Docker #AI (Artificial Intelligence) #Data Pipeline #Spark (Apache Spark) #Data Science #BI (Business Intelligence) #FHIR (Fast Healthcare Interoperability Resources) #Infrastructure as Code (IaC) #API (Application Programming Interface) #Jira
Role description
Job Title: Senior Data Engineer – Google Cloud Platform (GCP)
Location: Remote, USA
Experience Level: 8–10 years
Employment Type: Full-Time (W2 or 1099)
Compensation: Up to $120,000 per year + Benefits (based on experience)
About the Role
We are seeking an experienced Senior Data Engineer (GCP) with strong expertise in healthcare data systems to join our growing data engineering team. The ideal candidate will have a deep understanding of HIPAA-compliant architectures, FHIR/HL7 data standards, and GCP cloud-native data platforms. You’ll be responsible for building secure, scalable, and compliant data pipelines that power clinical analytics, population health, and healthcare operations.
Key Responsibilities
• Design, build, and maintain HIPAA-compliant ETL/ELT data pipelines using Dataflow, Dataproc, or Apache Beam (see the pipeline sketch after this list).
• Develop robust and optimized data warehouses and marts using BigQuery and Cloud Storage.
• Work with healthcare-specific data formats — HL7, FHIR, CCD, and EDI X12.
• Integrate clinical, claims, and operational data from multiple sources (EHR, EMR, payer systems).
• Build and manage data workflows using Cloud Composer (Airflow) for orchestration.
• Collaborate with data analysts and data scientists to provide curated datasets for analytics and ML.
• Implement data quality, metadata management, and data governance frameworks.
• Ensure compliance with HIPAA and HITRUST standards in all data operations.
• Support DevOps processes, CI/CD pipelines, and infrastructure as code (Terraform, Cloud Build).
• Work in an Agile/Scrum environment and contribute to architecture reviews and technical designs.
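For illustration, a minimal sketch of the kind of pipeline described above, using the Apache Beam Python SDK on the Dataflow runner. The bucket, project, and table names are hypothetical, and a real HIPAA-compliant pipeline would add de-identification, dead-letter handling, and managed schemas:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, SetupOptions

# Hypothetical names for illustration only.
SOURCE = "gs://example-claims-bucket/claims-*.jsonl"
TABLE = "example-project:analytics.claims_curated"

def parse_claim(line):
    """Parse one JSON claim record, keeping only the fields needed downstream."""
    record = json.loads(line)
    return {
        "claim_id": record.get("claim_id"),
        "member_id": record.get("member_id"),
        "paid_amount": record.get("paid_amount"),
    }

def run():
    options = PipelineOptions(
        runner="DataflowRunner",
        project="example-project",
        region="us-central1",
        temp_location="gs://example-tmp/bq",
    )
    # Ship main-session globals (e.g. the json import) to Dataflow workers.
    options.view_as(SetupOptions).save_main_session = True

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadClaims" >> beam.io.ReadFromText(SOURCE)
            | "ParseJson" >> beam.Map(parse_claim)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="claim_id:STRING,member_id:STRING,paid_amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```

WRITE_APPEND is used here because claims typically arrive incrementally; a full-refresh pipeline would use WRITE_TRUNCATE instead.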
Required Skills & Experience
• 8–10 years in data engineering, with at least 4 years on GCP (BigQuery, Dataflow, Dataproc, Pub/Sub, Composer).
• Proven experience working with healthcare datasets (EHR, EMR, claims, payer, or clinical data).
• Strong expertise in Python (PySpark) and advanced SQL for data transformations (see the transformation sketch after this list).
• Knowledge of FHIR/HL7 standards and healthcare data integration practices.
• Experience with data modeling, data lakes, and data warehouse architectures.
• Familiarity with data privacy laws and compliance frameworks (HIPAA, HITRUST).
• Strong knowledge of Terraform, Docker, and CI/CD tools for deploying data solutions.
• Proficiency with version control systems (Git) and Agile development tools (JIRA, Confluence).
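As a concrete example of the PySpark transformation work listed above, here is a short sketch that normalizes diagnosis codes and keeps only the latest version of each claim. The lake paths and column names are assumptions:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("curate-claims").getOrCreate()

# Hypothetical lake paths and column names for illustration.
claims = spark.read.parquet("gs://example-lake/raw/claims/")

# Rank versions of each claim by recency so we can keep only the latest.
latest_first = Window.partitionBy("claim_id").orderBy(F.col("updated_at").desc())

curated = (
    claims
    .withColumn("dx_code", F.upper(F.trim(F.col("dx_code"))))  # normalize codes
    .withColumn("rank", F.row_number().over(latest_first))
    .filter(F.col("rank") == 1)                                 # latest version only
    .drop("rank")
)

curated.write.mode("overwrite").parquet("gs://example-lake/curated/claims/")
```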
Preferred Qualifications
• GCP Professional Data Engineer Certification.
• Experience in ML pipelines or integrating AI/ML models for clinical predictions.
• Knowledge of FHIR API integration and the Google Cloud Healthcare API (a minimal sketch follows this list).
• Experience with Looker or Power BI for healthcare analytics reporting.
• Familiarity with AWS HealthLake or Azure Health Data Services (for cross-cloud interoperability).
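The Cloud Healthcare API exposes each FHIR store as a standard FHIR REST endpoint, so integration can be as simple as an authorized HTTP call. A minimal sketch, assuming a hypothetical project, dataset, and store path:

```python
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Application Default Credentials; the cloud-platform scope covers the API.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

# Hypothetical project/dataset/store path for illustration.
FHIR_BASE = (
    "https://healthcare.googleapis.com/v1/projects/example-project/"
    "locations/us-central1/datasets/clinical/fhirStores/ehr/fhir"
)

# Standard FHIR search: Patient resources updated since a given date.
resp = session.get(f"{FHIR_BASE}/Patient", params={"_lastUpdated": "gt2024-01-01"})
resp.raise_for_status()
bundle = resp.json()  # a FHIR Bundle; entries hold the matching resources
for entry in bundle.get("entry", []):
    print(entry["resource"]["id"])
```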
Soft Skills
• Excellent communication and collaboration with clinical, technical, and compliance teams.
• Strong analytical mindset and attention to data accuracy.
• Ability to work autonomously in a fully remote U.S. team.
• Leadership in mentoring junior engineers and defining cloud data standards.
Tech Stack
Languages: Python, SQL, PySpark
GCP Tools: BigQuery, Dataflow, Pub/Sub, Dataproc, Composer, Healthcare API
Orchestration: Cloud Composer (Airflow)
Infra & CI/CD: Terraform, Git, Jenkins, Cloud Build
Data Formats: FHIR, HL7, CCD, JSON, CSV, Parquet
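To illustrate how these pieces meet in Cloud Composer, a bare-bones Airflow DAG that refreshes a hypothetical BigQuery mart on a daily schedule (DAG id, tables, and SQL are assumptions):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical DAG and table names; runs daily at 06:00 UTC.
with DAG(
    dag_id="claims_daily_refresh",
    schedule_interval="0 6 * * *",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    refresh_mart = BigQueryInsertJobOperator(
        task_id="refresh_claims_mart",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.claims_mart AS
                    SELECT member_id, SUM(paid_amount) AS total_paid
                    FROM analytics.claims_curated
                    GROUP BY member_id
                """,
                "useLegacySql": False,
            }
        },
    )
```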