

SGI
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Houston, TX, for 12 months at a pay rate of "unknown." Requires 10+ years in software/data engineering and 3-4+ years with GCP, focusing on BigQuery, Python, SQL, and dbt.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#dbt (data build tool) #ETL (Extract, Transform, Load) #Compliance #Storage #GCP (Google Cloud Platform) #IAM (Identity and Access Management) #Data Transformations #Python #Security #Cloud #BigQuery #Scala #Data Engineering #Data Science #GitHub #Data Pipeline #Data Quality #Batch #DevOps #SQL (Structured Query Language) #Google Cloud Storage
Role description
GCP Data Engineer
Location: Houston, TX
Work Model: 100% Onsite
Duration: 12 months, with extension expected
Overview
We are seeking a highly experienced GCP Data Engineer to join a data-driven team supporting large-scale analytics and cloud-native data platforms. This role requires deep hands-on experience with Google Cloud Platform services, strong Python and SQL skills, and the ability to build, deploy, and maintain robust data pipelines in a production environment.
Key Responsibilities
• Design, build, and maintain scalable data pipelines and workflows on Google Cloud Platform
• Develop and optimize data solutions using BigQuery and Google Cloud Storage (GCS)
• Implement event-driven and batch processing using Pub/Sub
• Deploy and manage services using Cloud Run, Cloud Functions, and Cloud Composer
• Collaborate with DevOps teams to support CI/CD pipelines and cloud infrastructure best practices
• Manage service accounts, IAM permissions, and Secrets Manager configurations
• Build and maintain data transformations using dbt
• Support analytics and reporting use cases, including integrations with Power Apps
• Ensure data quality, performance optimization, and security compliance
• Work closely with cross-functional teams including data scientists, analysts, and application developers
Required Qualifications
• 10+ years of overall software or data engineering experience
• 3–4+ years of hands-on experience with the GCP data stack
Strong expertise in:
• BigQuery, GCS
• Cloud Run, Cloud Functions, Cloud Composer
• Pub/Sub
• Proficient in Python and SQL
• Experience with GitHub and modern DevOps practices
• Solid understanding of cloud security concepts, including service accounts and secrets management
• Hands-on experience with dbt
• Familiarity with Power Apps
• Google Cloud certification(s)
Preferred Skills
• Experience working in highly regulated or enterprise environments
• Strong troubleshooting and performance tuning skills