

Jade Business Services (JBS)
GCP DATA ENGINEER
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a contract basis in Houston, TX (Hybrid). Requires 10+ years of experience, expertise in Google Cloud tools, and skills in Python, BigQuery, and DevOps practices.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
February 25, 2026
Duration
Unknown
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Houston, TX
-
Skills detailed
#Datasets #BigQuery #Terraform #SAP #Batch #Data Quality #REST API #Cloud #Storage #Data Pipeline #Scala #DevOps #JSON (JavaScript Object Notation) #REST (Representational State Transfer) #Python #Data Engineering #IAM (Identity and Access Management) #API (Application Programming Interface) #GCP (Google Cloud Platform) #GitHub #Big Data
Role description
Title: GCP Data Engineer
Hire Type: Contract
Location: Houston, TX (Hybrid)
Job Description:
10+ years of total experience.
• Develop, construct, test, and maintain data acquisition pipelines for large volumes of structured and unstructured data, including batch and real-time processing in Google Cloud
• Build large and complex datasets based on business requirements
• Construct “big data” pipeline architecture
• Identify opportunities for data acquisition by working with stakeholders and business clients
• Translate business needs into technical requirements
• Leverage tools in the Google Cloud ecosystem such as Python, Dataflow, Datastream, CDC (Change Data Capture), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Cloud Storage to integrate systems and data pipelines
• Use logs and alerts to effectively monitor pipelines
• Use SAP SLT to replicate SAP tables to Google Cloud
• Develop JSON messaging structures for integrating with various applications
• Leverage DevOps and CI/CD practices (GitHub, Terraform) to ensure the reliability and scalability of data pipelines
• Partition/cluster and retrieve content in BigQuery, using IAM roles and policy tags to secure the data
• Use roles to secure access to datasets and authorized views to share data between projects
• Design and build ingestion pipelines using REST APIs
• Recommend ways to improve data quality, reliability, and efficiency
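The JSON messaging responsibility above can be sketched in plain Python. This is a minimal illustration only: the envelope fields (message_id, event_type, source, occurred_at, payload) are assumed names for this sketch, not a schema from this posting, and a real integration would follow whatever contract the applications agree on.

```python
import json
import uuid
from datetime import datetime, timezone

def build_message(event_type: str, source: str, payload: dict) -> str:
    """Wrap a payload in a simple envelope and serialize it to JSON.

    The envelope fields here are illustrative assumptions, not a
    schema required by the role described above.
    """
    envelope = {
        "message_id": str(uuid.uuid4()),   # unique per message
        "event_type": event_type,          # e.g. "order.created"
        "source": source,                  # originating system
        "occurred_at": datetime.now(timezone.utc).isoformat(),
        "payload": payload,                # application-specific body
    }
    return json.dumps(envelope)

# Example: a hypothetical order event from an SAP-fed source system.
msg = build_message("order.created", "sap-slt", {"order_id": 42, "amount": 99.5})
decoded = json.loads(msg)
```

A consumer can then dispatch on `event_type` without parsing the payload, which is one common reason for using an envelope at all.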
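For the BigQuery partition/cluster responsibility, a small helper can generate the table DDL. This is a sketch under assumptions: the table name, column names, and types below are hypothetical placeholders, and the resulting statement would be run via the BigQuery client library or the `bq` CLI.

```python
def partitioned_table_ddl(table: str, partition_col: str, cluster_cols: list[str]) -> str:
    """Build a CREATE TABLE statement with daily partitioning and clustering.

    The column list (event_ts, customer_id, amount) is a placeholder
    schema for illustration, not one taken from this posting.
    """
    cluster_clause = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE IF NOT EXISTS `{table}` (\n"
        f"  event_ts TIMESTAMP,\n"
        f"  customer_id STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY DATE({partition_col})\n"  # prunes scans by day
        f"CLUSTER BY {cluster_clause}"           # co-locates rows by key
    )

# Hypothetical table: partition on the event timestamp, cluster on customer.
ddl = partitioned_table_ddl("proj.dataset.orders", "event_ts", ["customer_id"])
```

Partitioning by date bounds the bytes scanned per query, while clustering on a frequently filtered column further reduces scan cost within each partition.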






