Zeus Solutions Inc

GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a GCP Data Engineer contract position in Houston, TX, requiring 10+ years of IT experience and 3–4+ years with GCP. Key skills include BigQuery, Cloud Functions, Python, SQL, and DBT. Onsite work is mandatory.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
Unknown
🗓️ - Date
February 7, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Houston, TX 77027
🧠 - Skills detailed
#dbt (data build tool) #ETL (Extract, Transform, Load) #Storage #GCP (Google Cloud Platform) #Deployment #IAM (Identity and Access Management) #Python #Security #Cloud #BigQuery #Scala #Data Engineering #GitHub #Data Pipeline #Data Quality #DevOps #SQL (Structured Query Language) #Google Cloud Storage #Data Modeling
Role description
Job Title: GCP Data Engineer
Location: Houston, TX (100% Onsite)
Experience: 10+ years total | 3–4+ years hands-on with GCP

Job Overview
We are looking for a seasoned GCP Data Engineer to design, build, and maintain scalable data pipelines and cloud-native solutions on Google Cloud Platform. This role is fully onsite in Houston and requires strong hands-on expertise across GCP services, data engineering best practices, and modern DevOps workflows.

Key Responsibilities
- Design, develop, and maintain data pipelines on Google Cloud Platform
- Build and optimize solutions using BigQuery and Google Cloud Storage (GCS)
- Develop cloud-native services using Cloud Run, Cloud Functions, and/or Cloud Composer
- Implement event-driven architectures using Pub/Sub (an illustrative sketch follows this posting)
- Write clean, efficient, and scalable code in Python and SQL (see the second sketch below)
- Manage source control, CI/CD pipelines, and deployments using GitHub and DevOps practices
- Configure and manage Service Accounts, IAM, and Secret Manager
- Develop and maintain data transformation models using dbt
- Collaborate with analytics and business teams, including integrations with Power Apps
- Troubleshoot performance, reliability, and data quality issues across the stack

Required Skills & Qualifications
- 10+ years of overall IT or data engineering experience
- 3–4+ years of hands-on experience with GCP
- Strong expertise in BigQuery, GCS, Cloud Run / Cloud Composer, Cloud Functions, and Pub/Sub
- Advanced Python and SQL skills
- Experience with GitHub, CI/CD, and DevOps best practices
- Solid understanding of GCP security concepts (IAM, service accounts, secrets)
- Hands-on experience with dbt
- Experience working with or supporting Power Apps

Nice to Have
- Experience designing high-volume, low-latency data pipelines
- Strong understanding of data modeling and analytics workloads
- Prior experience in enterprise or large-scale cloud environments

Job Type: Contract

Application Question(s):
- Do you have 10 or more years of total professional experience?
- Do you have 3–4 years of hands-on experience with Google Cloud Platform (GCP)?
- Are you currently located in, or able to work 100% onsite in, Houston, TX?
- Do you have hands-on experience with BigQuery and Google Cloud Storage (GCS)?
- Do you have experience with Cloud Run or Cloud Composer?
- Do you have experience with Cloud Functions and Pub/Sub?
- Do you have professional experience with Python and SQL?
- Do you have experience using GitHub and DevOps practices?
- Do you have experience setting up GCP Service Accounts and Secret Manager?
- Do you have hands-on experience with dbt and Power Apps?

Location: Houston, TX 77027 (Required)
Ability to Commute: Houston, TX 77027 (Required)
Work Location: In person
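
To give candidates a concrete picture of the event-driven pattern the responsibilities describe, here is a minimal sketch of a Pub/Sub-triggered Cloud Function (2nd gen) that streams each message into BigQuery. The project, dataset, table, and payload shape are hypothetical placeholders, not part of the posting; a production pipeline would add schema validation, dead-lettering, and monitoring.

```python
# Minimal sketch: Pub/Sub-triggered Cloud Function (2nd gen) that
# streams each message into BigQuery. TABLE_ID and the payload
# shape are hypothetical placeholders.
import base64
import json

import functions_framework
from google.cloud import bigquery

bq_client = bigquery.Client()
TABLE_ID = "my-project.my_dataset.events"  # hypothetical table


@functions_framework.cloud_event
def ingest_event(cloud_event):
    """Decode the Pub/Sub payload and stream one row into BigQuery."""
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    row = json.loads(payload)
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        # Raising makes this delivery fail so Pub/Sub can retry it.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```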
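Likewise, a minimal sketch of the combined Python and SQL work the role calls for: running a parameterized query against BigQuery with the official Python client. The table and column names are hypothetical; query parameters are used here because they avoid SQL injection and let BigQuery cache query plans.

```python
# Minimal sketch: a parameterized SQL query against BigQuery from
# Python. Project, dataset, table, and columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

QUERY = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `my-project.my_dataset.orders`  -- hypothetical table
    WHERE order_date >= @start_date
    GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2026-01-01"),
    ]
)

# Iterating the job waits for completion and pages through results.
for row in client.query(QUERY, job_config=job_config):
    print(row.customer_id, row.total_amount)
```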