Envision Technology Solutions

GCP Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a contract role for a GCP Data Architect based on-site in Davidson, NC. It requires strong expertise in GCP, Databricks, Looker, and SQL, along with experience in data architecture and engineering. Preferred certifications include Google Professional Data Engineer and Databricks Certified Data Engineer.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 10, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Davidson, NC
🧠 - Skills detailed
#Programming #DevOps #Infrastructure as Code (IaC) #Data Ingestion #Cloud #Databricks #Python #Storage #Security #Data Pipeline #Strategy #Compliance #Terraform #Monitoring #BI (Business Intelligence) #Data Engineering #Deployment #BigQuery #Spark (Apache Spark) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Governance #Data Architecture #Dataflow #GCP (Google Cloud Platform) #Looker #Semantic Models #Scala
Role description
Position: GCP Data Architect
Location: Davidson, NC (Onsite)
Duration: Contract

Job Description:
Experienced Data Platform Architect to design and manage enterprise-grade data solutions. This role will focus on building and optimizing data platforms leveraging Google Cloud Platform (GCP), Databricks, Looker, and Dataform, ensuring scalability, security, and performance across all data-driven applications.

Key Responsibilities:
Data Architecture & Strategy
o Design and implement modern data platform architecture for enterprise applications
o Ensure seamless integration of Dataform, Looker, Databricks, and GCP services
Cloud Data Platform
o Architect solutions using Google Cloud Platform components (BigQuery, Dataflow, Cloud Storage, Pub/Sub)
o Optimize data pipelines for performance and cost efficiency
Analytics & BI
o Develop semantic models and dashboards in Looker
o Enable self-service analytics and governance
Data Engineering
o Build and maintain data workflows using Dataform and Databricks
o Implement best practices for data ingestion, transformation, and orchestration
DevOps for Data
o Establish CI/CD pipelines for data workflows
o Automate deployments and monitoring for data services
Governance & Security
o Define and enforce data governance, security, and compliance standards

Required Skills & Experience:
o Cloud Expertise: Strong experience with the GCP data platform
o Data Engineering Tools: Proficiency in Databricks, Dataform, and SQL-based transformations
o BI Tools: Advanced knowledge of Looker
o Programming: Python, SQL, Spark
o DevOps: Familiarity with CI/CD tools and Infrastructure as Code (Terraform)
o Architecture: Ability to design end-to-end data solutions for large-scale enterprise environments

Preferred Qualifications:
o Google Professional Data Engineer or Cloud Architect certification
o Databricks Certified Data Engineer
o Experience with modern data governance frameworks
o Exposure to AI/ML and advanced analytics