

Sr Data Architect (GCP – Lakehouse, AI/ML)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr Data Architect (GCP – Lakehouse, AI/ML) in Nashville, TN, on a contract basis with an immediate start. It requires 8+ years of experience in data architecture and 5+ years with GCP; healthcare experience is preferred. The pay rate is unknown.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
July 24, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Nashville, TN
🧠 - Skills detailed
#Data Lifecycle #Data Lake #Data Management #Documentation #GCP (Google Cloud Platform) #Delta Lake #AI (Artificial Intelligence) #Project Management #Data Ingestion #Data Engineering #ML (Machine Learning) #Airflow #Apache Beam #Metadata #Python #Scrum #Cloud #ACID (Atomicity, Consistency, Isolation, Durability) #Data Strategy #React #Data Architecture #Indexing #Looker #Dataflow #Agile #Spark (Apache Spark) #BI (Business Intelligence) #Data Quality #Compliance #Data Governance #DevOps #FHIR (Fast Healthcare Interoperability Resources) #Scala #Microsoft Power BI #Storage #Apache Iceberg #Strategy #Tableau #Kafka (Apache Kafka) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Modeling #Data Processing #Data Pipeline #Security #Leadership #BigQuery
Role description
Position Title: Senior Data Architect (GCP – Lakehouse, AI/ML)
Location: Nashville, TN
Industry: Healthcare
Employment Type: Contract
Start Date: Immediate
About the Role
We are seeking an experienced and highly skilled Data Architect to join our dynamic team and lead the development of next-generation cloud-based data platforms.
This role is ideal for a strategic, hands-on technical leader with deep expertise in Google Cloud Platform (GCP), Lakehouse architectures, and data engineering.
You will help shape the future of data strategy in a leading healthcare organization focused on data-driven decision-making, operational efficiency, and better patient outcomes.
Key Responsibilities
Architecture & Technical Leadership
Design and implement scalable, high-performance, cost-effective data architecture solutions using GCP technologies: BigQuery, Dataflow, Dataproc, Cloud Spanner, Pub/Sub, GCS, Vertex AI.
Architect and manage data lakes/warehouses, with a strong emphasis on Lakehouse principles and technologies: Delta Lake, Apache Iceberg, Hudi.
Lead the development of data ingestion and transformation (ETL/ELT) pipelines across structured and unstructured data sources.
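To make the ingestion/transformation responsibility concrete, here is a minimal, dependency-free sketch of a staged ETL flow. All record fields and function names are hypothetical illustrations; a production pipeline in this role would run on Dataflow/Apache Beam rather than plain Python.

```python
import json
from datetime import date

# Hypothetical raw records, as they might arrive from a Pub/Sub-style feed.
RAW = [
    '{"patient_id": "p1", "visit_date": "2025-07-01", "charge": "120.50"}',
    '{"patient_id": "p2", "visit_date": "2025-07-02", "charge": "80.00"}',
    'not-json',  # malformed input the pipeline must tolerate
]

def extract(lines):
    """Parse raw JSON lines, routing malformed records to a dead-letter list."""
    good, dead = [], []
    for line in lines:
        try:
            good.append(json.loads(line))
        except json.JSONDecodeError:
            dead.append(line)
    return good, dead

def transform(records):
    """Normalize types and derive fields used downstream for partitioning."""
    return [
        {
            "patient_id": r["patient_id"],
            "visit_date": date.fromisoformat(r["visit_date"]),
            "charge_cents": int(float(r["charge"]) * 100),
        }
        for r in records
    ]

def load(records):
    """Stand-in for a lake/warehouse write: group rows by a date partition key."""
    partitions = {}
    for r in records:
        partitions.setdefault(r["visit_date"].isoformat(), []).append(r)
    return partitions

good, dead = extract(RAW)
tables = load(transform(good))
```

The dead-letter branch and the date-based partition key mirror two habits the posting emphasizes: data quality handling at ingestion and partition-aware storage design.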
Governance, Standards, and Strategy
Define and enforce data architecture best practices, including data governance, security, retention, and compliance.
Develop documentation and artifacts to illustrate the data lifecycle, from ingestion through consumption.
Provide thought leadership and contribute to enterprise-wide data strategy initiatives.
Guide and mentor data engineers and junior architects.
Collaboration & Stakeholder Engagement
Work with business stakeholders to translate strategic goals into practical data solutions.
Collaborate cross-functionally with software engineers, DevOps, product teams, and analysts to ensure data systems meet end-user needs.
Maintain strong communication with data governance, compliance, and security teams.
Required Skills & Experience
8+ years of experience in data architecture, engineering, and data management.
5+ years of GCP experience, including BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Composer.
Proven experience designing Lakehouse architectures using Delta Lake, Iceberg, or Hudi.
Strong knowledge of schema evolution, data partitioning, indexing, ACID compliance, and distributed file systems.
Proficiency in Python and SQL; familiarity with Apache Spark, Airflow, and CI/CD pipelines.
Deep understanding of MLOps, real-time data processing, and integrating AI/ML into data workflows.
Strong analytical and problem-solving skills with a business mindset.
Familiarity with BI/AI tools and their integration with modern data platforms (e.g., Looker, Power BI, Tableau, Vertex AI).
Hands-on experience with data modeling, metadata management, and data quality frameworks.
Experience in Agile/Scrum environments.
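The "schema evolution" requirement above can be illustrated with a small sketch. Lakehouse table formats such as Delta Lake and Apache Iceberg merge schemas on read, so rows written under an older schema surface new columns with defaults; this toy version (hypothetical schema and field names) shows the idea without any table-format dependency.

```python
def evolve(rows, schema):
    """Additive schema evolution: rows written under an older schema gain the
    new columns with default values, as Lakehouse formats do on read."""
    return [
        {col: row.get(col, default) for col, default in schema.items()}
        for row in rows
    ]

# "payer" was added in schema v2; older rows were written without it.
SCHEMA_V2 = {"patient_id": None, "charge_cents": 0, "payer": "unknown"}
old_rows = [{"patient_id": "p1", "charge_cents": 12050}]  # written under v1

evolved = evolve(old_rows, SCHEMA_V2)
```

Real table formats also track column IDs so renames and drops stay safe; this sketch covers only the additive case.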
Preferred Qualifications
Google Cloud Certification (e.g., Professional Data Engineer, Cloud Architect).
Experience in healthcare or regulated data environments.
Exposure to FHIR, HL7, or other healthcare data standards.
Experience with Apache Beam, Kafka, or other streaming platforms.
Familiarity with React, Dash, or other front-end tools for visualizing data pipelines.
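For candidates new to FHIR: it models clinical data as JSON resources with typed, often repeating, fields. The sketch below parses a pared-down FHIR R4 Patient (the standard "example" resource) with only the standard library; the `display_name` helper is a hypothetical convenience, not part of the FHIR spec.

```python
import json

# A pared-down FHIR R4 Patient resource (only a few standard fields shown).
patient_json = '''
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
'''

def display_name(resource):
    """Render the first HumanName as 'Given... Family'.

    FHIR's 'name' field is a list of HumanName objects, each with a repeating
    'given' list and a single 'family' string.
    """
    name = resource["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

patient = json.loads(patient_json)
```

Real FHIR payloads carry many more fields (identifiers, extensions, references), which is why healthcare pipelines typically validate resources against the spec before transformation.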
Core Competencies
Excellent communication and interpersonal skills.
Strategic thinking and technology foresight.
Strong project management and multitasking capabilities.
Ability to work independently and drive outcomes across teams.