

Data Platform Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Platform Architect in Dallas, TX; the contract length and pay rate are unspecified. Key skills include expertise in GCP services, Terraform, Docker, Kubernetes, and data pipeline design. Required certifications: Google Cloud Professional Cloud Architect and Professional Data Engineer.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
May 22, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Dallas, TX
Skills detailed
#Terraform #AI (Artificial Intelligence) #dbt (data build tool) #BigQuery #API (Application Programming Interface) #GitLab #Batch #TensorFlow #Model Deployment #Cloud #GCP (Google Cloud Platform) #Spark (Apache Spark) #Compliance #Data Engineering #Kafka (Apache Kafka) #Deployment #Security #Apache Spark #Documentation #ETL (Extract, Transform, Load) #Storage #Infrastructure as Code (IaC) #Containers #Data Pipeline #Airflow #Data Governance #Dataflow #IAM (Identity and Access Management) #PyTorch #Docker #Kubernetes #ML (Machine Learning) #Apache Beam #Data Processing #Data Security
Role description
Job Role: Data Platform Architect
Job Location: Dallas, TX
Core Responsibilities
Establish data flow patterns across batch, streaming, and ML use cases
Create decision frameworks for GCP technology selection
Develop reference architectures for real-time analytics and ML model deployment
Implement automated architecture validation in CI/CD pipelines
Design GitOps workflows for infrastructure and application deployments
Create standardized deployment patterns for different workload types
Develop IAM role assignments based on workload type
Create internal knowledge bases and documentation repositories
Lead training programs for data engineers and analysts
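The IAM responsibility above, assigning roles per workload type, could be sketched as a simple mapping. This is a minimal illustrative sketch, not part of the posting: the workload categories are assumptions, while the role names are GCP predefined IAM roles.

```python
# Hypothetical sketch: map workload types to least-privilege GCP IAM roles.
# Workload categories ("batch", "streaming", "ml") are illustrative assumptions;
# the role identifiers are real GCP predefined roles.

WORKLOAD_ROLES = {
    "batch": ["roles/bigquery.jobUser", "roles/storage.objectViewer"],
    "streaming": ["roles/pubsub.subscriber", "roles/dataflow.worker"],
    "ml": ["roles/aiplatform.user", "roles/storage.objectAdmin"],
}

def roles_for_workload(workload_type: str) -> list[str]:
    """Return the IAM roles to bind for a given workload type."""
    try:
        return WORKLOAD_ROLES[workload_type]
    except KeyError:
        raise ValueError(f"Unknown workload type: {workload_type!r}")

print(roles_for_workload("streaming"))
```

In practice such a mapping would live in Terraform (e.g. as a module variable feeding `google_project_iam_member` bindings) rather than application code.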
Technical Skills
GCP Services: Expert in BigQuery, Dataproc, Dataflow, Data Fusion, Pub/Sub, Cloud Storage, Cloud Run
Infrastructure as Code: Advanced Terraform skills with module development
Containers & Orchestration: Deep experience with Docker, Kubernetes, and GKE
CI/CD: Proficient with Cloud Build and GitLab CI/CD pipelines
Data Security: Strong understanding of data governance, security, and compliance
Data Processing: Expert in data pipeline design using Apache Spark, Apache Beam
Machine Learning Ops: Working knowledge of Vertex AI, AI Platform, TensorFlow/PyTorch deployment
Streaming: Experience with Kafka and Pub/Sub architectures
API Design: RESTful API design with authentication and authorization patterns
Workflow Management: Experience with Astronomer/Airflow for orchestration
Data Transformation: dbt implementation experience
GitOps: Proficient with GitOps principles and tools (ArgoCD, Flux)
Soft Skills
Strategic thinking and architectural vision
Advanced communication and stakeholder management
Ability to balance technical excellence with business needs
Strong mentorship and knowledge sharing capabilities
Collaborative approach to architectural decisions
Certifications
Google Cloud Professional Cloud Architect (Required)
Google Cloud Professional Data Engineer (Required)
Google Cloud Professional Security Engineer (Recommended)
Kubernetes CKA/CKAD (Recommended)
Terraform Associate (Recommended)