

Senior Data Architect – GCP | Lakehouse | AI/ML | Healthcare
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect with 13+ years of IT experience, focusing on GCP, Lakehouse technologies, and AI/ML in healthcare. Contract length is unspecified, with an immediate start in Nashville, TN. Key skills include Python, SQL, and data governance.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
July 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Nashville, TN
🧠 - Skills detailed
#Data Engineering #Microsoft Power BI #BigQuery #Compliance #Delta Lake #Data Governance #Tableau #Kafka (Apache Kafka) #Apache Iceberg #SQL (Structured Query Language) #Data Architecture #AI (Artificial Intelligence) #Metadata #FHIR (Fast Healthcare Interoperability Resources) #ML (Machine Learning) #Cloud #GCP (Google Cloud Platform) #Python #Spark (Apache Spark) #Scala #Dataflow #Strategy #Looker #Data Strategy #BI (Business Intelligence) #Airflow
Role description
Senior Data Architect – GCP | Lakehouse | AI/ML | Healthcare | 13+ yrs
🔹 Location: Nashville, TN
🔹 Employment Type: Contract
🔹 Start Date: Immediate
🔹 Domain: Healthcare
About the Role:
We are seeking a highly experienced Senior Data Architect to lead the architecture, design, and implementation of scalable, cloud-native data platforms using Google Cloud Platform (GCP) and Lakehouse technologies. You’ll be responsible for shaping data strategy, optimizing pipelines for AI/ML use cases, and guiding data engineering teams in a healthcare-focused environment.
Must-Have Skills:
• Total IT Experience: 13+ years
• Data Architecture & Engineering: 8+ years
• GCP: 5+ years (BigQuery, Dataflow, Pub/Sub, Composer)
• Lakehouse Technologies: Delta Lake / Apache Iceberg / Hudi – hands-on
• Strong in Python, SQL, Apache Spark, Airflow
• Experience with MLOps, AI/ML model pipelines, streaming (Kafka or Pub/Sub)
• Experience integrating with BI tools (Looker, Tableau, Power BI)
• Hands-on with data governance, metadata, compliance (HIPAA/FHIR)
Nice to Have:
• GCP Certified (Data Engineer / Architect)
• Exposure to FHIR / HL7 or other healthcare data standards
• Experience with vector DBs (Pinecone, FAISS) and LLM pipelines
Do Not Apply If:
• You have less than 12 years total experience
• You have not led architecture decisions or worked independently with stakeholders
• You’ve only used GCP tools but haven’t built full-scale data platforms