

Rapid Eagle Inc
Healthcare Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Healthcare Data Engineer, 100% remote, with an unspecified contract length. Pay ranges from $55.00 to $80.00 per hour. The role requires 8+ years of data engineering experience, expertise in AWS and Databricks, and familiarity with healthcare data.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
February 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Remote
-
🧠 - Skills detailed
#Data Ingestion #Compliance #Scala #ETL (Extract, Transform, Load) #Spark (Apache Spark) #SQL (Structured Query Language) #FHIR (Fast Healthcare Interoperability Resources) #PySpark #Spark SQL #Data Quality #Agile #Data Pipeline #Lambda (AWS Lambda) #Security #S3 (Amazon Simple Storage Service) #Monitoring #Migration #Data Engineering #Databricks #PostgreSQL #Data Security #AWS (Amazon Web Services) #Delta Lake #Cloud #Data Architecture
Role description
Healthcare Data Engineer
100% Remote
Immediate Interview
Key Responsibilities:
• Lead the migration of AWS-hosted PostgreSQL workloads to a Databricks Lakehouse architecture.
• Design and implement scalable ETL/ELT pipelines using Databricks (PySpark, SQL, Delta Lake) in the AWS ecosystem.
• Work with complex healthcare data formats, including X12 837 claims, EBCDIC files, and other structured/unstructured formats.
• Implement data masking, profiling, and governance solutions to ensure HIPAA-compliant data handling across ingestion, transformation, and consumption layers.
• Optimize data pipelines for performance, reliability, and cost-efficiency within the AWS ecosystem.
• Collaborate with data architects, analysts, and compliance teams to enforce data security best practices.
• Drive data quality monitoring and reconciliation processes in high-volume, sensitive data environments.
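As an illustration of the masking responsibility above, here is a minimal plain-Python sketch of deterministic pseudonymization (the same idea scales up via PySpark's DataFrame API). The field names and salt are illustrative assumptions, not taken from the posting:

```python
import hashlib

# Hypothetical PHI fields to mask before data leaves the ingestion layer.
PHI_FIELDS = {"member_id", "ssn", "dob"}

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Pseudonymize a value with salted SHA-256.

    Deterministic hashing preserves joinability across tables while
    removing the raw identifier; a real HIPAA pipeline would keep the
    salt in a secrets store and layer access controls on top.
    """
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:16]

def mask_record(record: dict) -> dict:
    """Return a copy of the record with PHI fields masked."""
    return {
        k: (mask_value(v) if k in PHI_FIELDS and v is not None else v)
        for k, v in record.items()
    }

claim = {"member_id": "M12345", "dob": "1980-01-01", "claim_amount": 250.0}
masked = mask_record(claim)
```

Because the hash is deterministic, the same member ID always masks to the same token, so joins and reconciliation still work on masked data.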
Must-Have Qualifications:
• 8+ years of hands-on data engineering experience with a strong focus on AWS + Databricks environments.
• Proven track record of migrating PostgreSQL workloads from AWS to modern lakehouse platforms.
• Databricks Certified Data Engineer (Associate or Professional).
• Experience with healthcare data and regulatory formats, including X12 837, HL7, and EBCDIC files.
• Deep expertise in data masking, governance frameworks, and working in HIPAA-regulated environments.
• Working knowledge of CI/CD processes and practices in a cloud ecosystem.
• Proficiency in PySpark, SQL, Delta Lake, and data transformation in Databricks.
• Solid understanding of data ingestion from AWS services (S3, Glue, Lambda) into Databricks.
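For context on the X12 837 format named above: 837 claim files are delimited text, with segments separated by `~` and elements by `*`. A tiny illustrative splitter, assuming the common default delimiters (a production parser derives them from the fixed-width ISA envelope segment):

```python
def parse_x12_segments(raw: str, seg_term: str = "~", elem_sep: str = "*"):
    """Split an X12 transaction into a list of segments,
    each segment a list of elements.

    Assumes default delimiters; real 837 files declare their
    delimiters in the ISA envelope, which should be read first.
    """
    segments = []
    for seg in raw.split(seg_term):
        seg = seg.strip()
        if seg:
            segments.append(seg.split(elem_sep))
    return segments

# A made-up fragment for demonstration, not a complete valid 837.
sample = "ST*837*0001~BHT*0019*00*REF123*20260209*1200*CH~SE*3*0001~"
segs = parse_x12_segments(sample)
```

Each segment's first element is its ID (`ST`, `BHT`, `SE`, ...), which is how downstream pipeline stages route claim loops to the right tables.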
Preferred Skills:
• Familiarity with FHIR, X12 837, HL7, or other clinical data standards.
• Experience with Unity Catalog and Lakehouse governance frameworks.
• Knowledge of Agile methodologies and CI/CD practices for data pipelines.
• This role is healthcare domain-focused; prior experience working with payers, providers, or healthcare clearinghouses is highly desirable.
Pay: $55.00 - $80.00 per hour
Benefits:
Flexible schedule
Health insurance
Tuition reimbursement
Vision insurance
Work Location: Remote






