

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a GCP Data Engineer contract position in Houston, TX, requiring 10+ years of experience, GCP Professional Data Engineer certification, and expertise in Python, SAP SLT replication, and CI/CD practices. Onsite work from Day 1 is mandatory.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 7, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Houston, TX
Skills detailed:
#Terraform #GCP (Google Cloud Platform) #REST API #BigQuery #Data Access #Cloud #Storage #IAM (Identity and Access Management) #JSON (JavaScript Object Notation) #GitHub #SAP #Batch #Data Quality #Clustering #Data Ingestion #Monitoring #Replication #Python #DevOps #Security #Data Pipeline #Data Engineering #Dataflow #Datasets #Data Processing #Scala #Big Data #REST (Representational State Transfer) #Automation
Role description
Job Title: GCP Data Engineer
Position Type: Contract
Location: Houston, TX (Onsite from Day 1)
Job Summary
We are looking for a certified GCP Data Engineer with hands-on experience building data pipelines on Google Cloud. Candidates must have SAP SLT replication experience and strong Python and Dataflow skills, and be proficient in CI/CD practices (GitHub and Terraform). The right candidate can bridge business needs with technical execution in a secure, regulated environment. Houston-based professionals are preferred. Certification is mandatory.
The role requires strong technical expertise in GCP services and data pipeline development, along with hands-on experience in both batch and real-time processing. The ideal candidate will be adept at collaborating with stakeholders, translating business requirements into technical solutions, and ensuring the scalability and reliability of data systems.
Key Responsibilities
• Design, develop, test, and maintain data acquisition pipelines for large-scale structured and unstructured data.
• Build and manage complex datasets to support business needs.
• Develop scalable big data pipeline architectures in GCP.
• Collaborate with stakeholders to identify opportunities for new data acquisition and integration.
• Translate business needs into technical requirements and solutions.
• Work with GCP ecosystem tools, including Python, Dataflow, Datastream, CDC, Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Cloud Storage (a minimal pipeline sketch follows this list).
• Implement monitoring solutions using logs and alerts for pipeline performance.
• Use SAP SLT to replicate SAP tables into GCP.
• Develop JSON-based messaging structures for application integration.
• Apply DevOps and CI/CD practices (GitHub, Terraform) for automation and scalability.
• Optimize datasets with partitioning, clustering, IAM roles, and policy tags for security and performance (see the table-creation sketch at the end of this posting).
• Manage data access with roles, authorized views, and security policies.
• Build data ingestion pipelines leveraging REST APIs.
• Continuously recommend improvements in data quality, governance, and efficiency.
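To make the pipeline work above concrete, here is a minimal sketch of a streaming Dataflow (Apache Beam) job that reads JSON messages from Pub/Sub and appends them to BigQuery. It is illustrative only, not this employer's actual pipeline; the project, topic, table, and schema names are all hypothetical.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode is required for a Pub/Sub source.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical topic name; messages arrive as JSON-encoded bytes.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/example-events")
            # Parse each message; assumes payloads already match the schema below.
            | "ParseJson" >> beam.Map(json.loads)
            # Hypothetical destination table and schema.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,payload:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

In practice the schema, error handling, and windowing would be driven by the actual message contracts (for example, the JSON structures replicated from SAP via SLT).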
Required Skills & Experience
• 10+ years of professional experience as a data engineer.
• Proven track record of delivering enterprise-scale data solutions.
• Strong expertise in the Google Cloud Platform (GCP) ecosystem.
• Proficiency in Python for data engineering tasks.
• Experience with batch and real-time data processing.
• Hands-on experience with SAP SLT replication.
• Strong understanding of DevOps and CI/CD practices (GitHub, Terraform).
• GCP Professional Data Engineer certification is required.
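As an illustration of the dataset-optimization responsibility above (partitioning, clustering, and access control), here is a minimal sketch using the google-cloud-bigquery Python client to create a date-partitioned, clustered table. The project, dataset, table, and field names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_id", "STRING"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("event_date", "DATE"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)

# Daily partitioning on event_date prunes scans to the dates queried;
# clustering on customer_id co-locates rows that are filtered together.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```

IAM roles, authorized views, and policy tags would then be layered on top of a table definition like this to meet the posting's security and data-access requirements.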