

Panacea Direct Inc
Enterprise Data Architect (GCP)
Featured Role | Apply direct with Data Freelance Hub
This role is for an Enterprise Data Architect (GCP) on a contract basis, remote, paying $70.00 - $80.00 per hour. Required skills include Google Cloud, Apache Iceberg, and Delta Lake; familiarity with healthcare interoperability standards is preferred. 20+ years of experience is required.
Country: United States
Currency: $ USD
Day rate: 640
Date: November 11, 2025
Duration: Unknown
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: Remote
Skills detailed: #Security #Trino #BigQuery #Spark (Apache Spark) #Databricks #Data Architecture #Metadata #Data Quality #Delta Lake #Data Engineering #Strategy #Storage #Cloud #GCP (Google Cloud Platform) #Data Catalog #Data Management #Scala #Dataflow #Kafka (Apache Kafka) #Data Strategy #Presto #Compliance #Apache Iceberg #Data Modeling #Apache Kafka #FHIR (Fast Healthcare Interoperability Resources) #Leadership #Observability
Role description
Position: Enterprise Data Architect (GCP)
Location: Remote
Duration: Contract
Mandatory skills:
· Cloud Platform: Google Cloud (BigQuery, Dataflow, Dataproc).
· Lakehouse Frameworks: Apache Iceberg, Delta Lake, Hudi.
· Streaming & Event-Driven Architecture: Apache Kafka, Google Pub/Sub, Apache Flink.
· Query Federation: Starburst, Presto, Trino.
· Data Modeling: Advanced modeling techniques, schema evolution.
· Governance & Security: Metadata management and data quality frameworks.
· Architecture Principles: Event sourcing, CQRS, Data Mesh.
Good-to-have skills:
· Experience with Databricks Lakehouse.
· Understanding of data catalog and data observability tools such as Atlan and Monte Carlo.
· Healthcare knowledge: understanding of and exposure to healthcare interoperability standards such as HL7, CCDA, and FHIR; experience implementing clinical and claims data analytics in a large payor ecosystem.
Responsibilities:
Architectural Design
· Define and implement an Open Lakehouse Architecture leveraging technologies like Apache Iceberg and Delta Lake.
· Design Event-Driven Architecture using streaming platforms such as Apache Kafka, Google Pub/Sub, and Apache Flink (see the illustrative sketch below).
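As a point of reference for the two bullets above, here is a minimal sketch of one way such a pipeline could look: a PySpark Structured Streaming job that reads events from Kafka and appends them to an Apache Iceberg table on Cloud Storage. The topic, catalog, bucket, and schema names are invented for illustration, and the Kafka, Iceberg, and GCS Spark connectors are assumed to be available; this is not a description of the client's actual environment.

```python
# Illustrative sketch only: stream events from Kafka into an Iceberg table on GCS.
# Topic, catalog, bucket, and schema names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = (
    SparkSession.builder.appName("kafka-to-iceberg")
    # Assumed: an Iceberg Hadoop catalog backed by a GCS bucket.
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "gs://example-lakehouse/warehouse")
    .getOrCreate()
)

event_schema = StructType([
    StructField("member_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "claims-events")                # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append to the Iceberg table; the checkpoint makes the sink restartable.
query = (
    events.writeStream.format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "gs://example-lakehouse/checkpoints/claims")
    .toTable("lake.raw.claims_events")
)
query.awaitTermination()
```

A Pub/Sub, Flink, or Delta Lake variant would follow the same shape, with different source and sink connectors.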
Cloud & Platform Expertise
· Architect solutions on Google Cloud Platform using services like Dataproc, Dataflow, and Cloud Storage.
· Integrate Starburst for federated query capabilities across heterogeneous data sources (see the federated-query sketch below).
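To illustrate the federation bullet above, the sketch below runs a single query through the Trino Python client that joins tables from two different catalogs. The host, catalog, schema, and table names are placeholders; which catalogs actually exist depends on the Starburst cluster's connector configuration.

```python
# Illustrative sketch only: one federated query spanning two Trino/Starburst catalogs.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com",   # placeholder coordinator host
    port=443,
    user="data_architect",
    http_scheme="https",
    catalog="bigquery_analytics",   # assumed BigQuery connector catalog
    schema="claims",
)

cur = conn.cursor()
cur.execute(
    """
    SELECT m.member_id, m.plan_code, SUM(c.paid_amount) AS total_paid
    FROM bigquery_analytics.claims.claim_lines AS c
    JOIN lakehouse.curated.members AS m   -- assumed Iceberg/Hive catalog
      ON c.member_id = m.member_id
    GROUP BY m.member_id, m.plan_code
    ORDER BY total_paid DESC
    LIMIT 20
    """
)
for member_id, plan_code, total_paid in cur.fetchall():
    print(member_id, plan_code, total_paid)
```

The value of federation here is that the join spans both sources in one SQL statement, without copying data into either system first.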
Data Strategy & Governance
· Establish enterprise-wide data standards, metadata management, and data quality frameworks.
· Ensure compliance with security, privacy, and regulatory requirements.
Integration & Scalability
· Develop strategies for integrating structured, semi-structured, and unstructured data from multiple sources.
· Optimize for performance, cost, and elastic scalability in cloud and hybrid environments.
Collaboration & Leadership
· Partner with business stakeholders, data engineers, and application architects to align architecture with business goals.
· Mentor teams on modern data architecture patterns and best practices.
Technical Expertise
· Strong knowledge of Open Lakehouse frameworks (Iceberg, Delta Lake, Hudi).
· Proficiency in streaming technologies (Kafka, Flink, Spark Structured Streaming, Google Pub/Sub).
· Hands-on experience with Google Cloud Platform (BigQuery, Dataflow, Dataproc).
· Experience with Starburst or similar query federation tools.
· Expertise in data modeling, partitioning strategies, and schema evolution (see the sketch after this list).
· Familiarity with object storage (GCS) and query engines (Presto, Trino).
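To make the partitioning and schema-evolution bullet concrete, here is a small sketch that issues Iceberg DDL through Spark SQL: hidden partitioning at table creation, additive column changes, and a later partition-spec change. Table, column, and catalog names are illustrative, and the Iceberg Spark runtime plus its SQL extensions are assumed to be configured.

```python
# Illustrative sketch only: Iceberg partitioning and schema evolution via Spark SQL.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-table-evolution")
    # Assumed: Iceberg SQL extensions and the same GCS-backed catalog as above.
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "gs://example-lakehouse/warehouse")
    .getOrCreate()
)

# Hidden partitioning: partition by a transform of a column, not a derived column.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.curated.claims (
        claim_id      STRING,
        member_id     STRING,
        paid_amount   DECIMAL(12, 2),
        service_date  DATE
    )
    USING iceberg
    PARTITIONED BY (months(service_date))
""")

# Schema evolution: metadata-only changes, no data files rewritten.
spark.sql("ALTER TABLE lake.curated.claims ADD COLUMN adjudication_status STRING")
spark.sql("ALTER TABLE lake.curated.claims RENAME COLUMN paid_amount TO paid_amount_usd")

# Partition evolution: new writes use the new spec; existing files keep the old layout.
spark.sql("ALTER TABLE lake.curated.claims ADD PARTITION FIELD bucket(16, member_id)")
```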
Experience:
· 20+ years
Job Type: Contract
Pay: $70.00 - $80.00 per hour
Expected hours: 40 per week
Work Location: Remote