

GCP Data Architect – Cloud Data Engineering | Healthcare
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Architect – Cloud Data Engineering in Nashville, TN, offering an immediate-start contract at $60.00 - $70.00 per hour. It requires 8+ years in data architecture, 5+ years with GCP tools, and healthcare experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date discovered
July 25, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Nashville, TN 37201
🧠 - Skills detailed
#Data Engineering #Storage #FHIR (Fast Healthcare Interoperability Resources) #Data Pipeline #Cloud #Microsoft Power BI #Delta Lake #Dataflow #BI (Business Intelligence) #Data Management #Spark (Apache Spark) #Metadata #Scala #Data Architecture #GCP (Google Cloud Platform) #Compliance #Apache Iceberg #PySpark #SQL (Structured Query Language) #Tableau #ETL (Extract, Transform, Load) #Airflow #Batch #BigQuery #Data Governance #DevOps #NLP (Natural Language Processing) #ML (Machine Learning) #AI (Artificial Intelligence) #Looker #Security #Spark SQL #Python
Role description
GCP Data Architect – Cloud Data Engineering | Healthcare
Location: Nashville, TN
Job Type: Contract
Start Date: Immediate
Job Overview
We’re seeking a hands-on and strategic GCP Data Architect to lead the design and implementation of modern, scalable data platforms within the healthcare industry. The ideal candidate will bring deep cloud data engineering experience, especially on Google Cloud Platform (GCP) using BigQuery, Dataflow, Dataproc, and Pub/Sub, with a passion for building Lakehouse architectures and enabling AI/ML-driven decision systems.
Key Responsibilities
- Design and implement Lakehouse architecture using Delta Lake, Iceberg, or Hudi (see the sketch after this list)
- Build and maintain data pipelines (ETL/ELT) across batch and streaming systems
- Work with GCP technologies: BigQuery, Dataflow, Dataproc, Pub/Sub, GCS, Cloud Composer
- Enable real-time processing and AI/ML pipelines using MLOps best practices
- Ensure data governance, metadata management, lineage, and security (HIPAA)
- Collaborate across DevOps, analytics, compliance, and product teams
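As a rough illustration of the Lakehouse responsibility above, here is a minimal PySpark sketch of a batch write into a Delta table on GCS. The bucket paths, column names, and data shape are hypothetical, and it assumes the delta-spark package and the GCS connector are available in the Spark environment; this is a sketch of the pattern, not the employer's actual pipeline.

```python
# A minimal sketch, assuming Spark 3.x with delta-spark installed and
# GCS credentials already configured; all paths/columns are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("lakehouse-sketch")
    # Enable Delta Lake's SQL extensions and catalog (delta-spark convention)
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read a raw batch extract (e.g., claims records) from a GCS landing zone
raw = spark.read.json("gs://example-landing-zone/claims/2025-07-25/")

# Light cleanup: drop records missing required keys
clean = raw.dropna(subset=["claim_id", "member_id"])

# Write to a Delta table in the curated zone; Delta layers ACID
# transactions and time travel over plain Parquet files in the bucket
(clean.write.format("delta")
      .mode("append")
      .partitionBy("service_date")
      .save("gs://example-curated-zone/claims_delta/"))
```

The same write path would work with Iceberg or Hudi by swapping the format and catalog configuration; Delta is shown only because it is listed first in the posting.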
Must-Have Qualifications
- 8+ years of experience in data architecture and engineering
- 5+ years of hands-on experience with GCP tools: BigQuery, Cloud Storage, Dataflow, Dataproc
- Proven experience building Lakehouse data platforms
- Strong skills in Python, Spark (PySpark), SQL, and Airflow (a pipeline sketch follows this list)
- Experience working in healthcare data environments (HIPAA, FHIR, HL7)
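For the Airflow skill named above, this is a minimal sketch of the kind of daily batch DAG the role might own: loading a day's GCS extract into a BigQuery staging table. The DAG id, bucket, and table names are hypothetical; it assumes Apache Airflow 2.4+ with the Google provider installed, as on Cloud Composer 2.

```python
# A minimal sketch, assuming Airflow 2.4+ and the Google provider package;
# DAG id, bucket, and BigQuery destination are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="claims_gcs_to_bigquery",
    start_date=datetime(2025, 7, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's newline-delimited JSON extract from GCS into a
    # BigQuery staging table; {{ ds }} is Airflow's execution-date macro
    load_claims = GCSToBigQueryOperator(
        task_id="load_claims",
        bucket="example-landing-zone",
        source_objects=["claims/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example_project.staging.claims",
        write_disposition="WRITE_TRUNCATE",
    )
```

In practice a downstream task would then run BigQuery SQL or a Dataflow job to merge staging data into curated tables, but the single-task version keeps the sketch self-contained.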
Preferred Qualifications
- Google Cloud Certification (e.g., Professional Data Engineer)
- Experience with BI tools: Tableau, Power BI, Looker
- Background in NLP, MLOps, or integrating LLM/GenAI pipelines
Job Type: Contract
Pay: $60.00 - $70.00 per hour
Application Question(s):
- Which of the following GCP tools have you worked with: BigQuery, Dataflow, Pub/Sub, Cloud Composer, Dataproc?
- Have you implemented a Lakehouse architecture (Delta Lake, Apache Iceberg, or Hudi)?
- Do you have healthcare data experience (HIPAA, FHIR, HL7)?
- Are you currently Google Cloud certified?
Experience:
Google Cloud Platform (GCP): 5 years (Required)
Ability to Commute:
Nashville, TN 37201 (Required)
Work Location: In person