

Coltech
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior GCP Data Engineer on a 12-month remote contract, paying $100–$120/hour. Key skills include GCP, BigQuery, and data pipeline development. Proven GCP experience and strong SQL skills are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
960
🗓️ - Date
April 24, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Batch #Data Pipeline #Terraform #Data Quality #GCP (Google Cloud Platform) #Data Processing #Storage #Data Engineering #Deployment #Scala #SQL (Structured Query Language) #Dataflow #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #Cloud #Apache Beam #Datasets #BigQuery
Role description
🚀 Hiring: Senior GCP Data Engineer (Fully Remote | 12-Month Contract)
I’m looking for an experienced GCP Data Engineer to join a high-impact data transformation programme on a 12-month contract.
💰 $100–$120/hour
🌍 Fully Remote (UK/EU timezone overlap preferred)
📅 12 months (strong likelihood of extension)
🔍 The Role
You’ll be hands-on building and optimising a modern data platform on Google Cloud Platform (GCP), working closely with architects and stakeholders to deliver scalable data solutions.
This role is ideal for someone who enjoys building robust pipelines, working with large datasets, and solving real-world data challenges in GCP.
🧠 Key Responsibilities
• Build and maintain scalable data pipelines (batch & real-time)
• Work with large datasets in BigQuery, optimising performance and cost
• Develop data processing solutions using Dataflow (Apache Beam), as sketched after this list
• Integrate data using Pub/Sub and Cloud Storage
• Support data modelling and transformation workflows
• Collaborate with engineers, analysts, and business stakeholders
• Follow best practices in data quality, testing, and deployment
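For a flavour of this work, here is a minimal sketch of a streaming Dataflow job in Python that reads JSON events from Pub/Sub and appends them to BigQuery. The subscription and table names are hypothetical placeholders, not details of this engagement.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical resource names; substitute your own project/dataset.
    SUBSCRIPTION = "projects/my-project/subscriptions/events-sub"
    TABLE = "my-project:analytics.events"

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Pub/Sub delivers raw bytes; decode and parse each message.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append rows to an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

The same pipeline runs locally on the DirectRunner, or on Dataflow by passing --runner=DataflowRunner along with project, region, and temp_location options.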
☁️ Tech Stack (GCP-focused)
BigQuery · Dataflow (Apache Beam) · Pub/Sub · Cloud Storage · Cloud Composer / Dataform
• Strong preference for candidates with hands-on GCP experience (not just general cloud exposure)
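Cloud Composer is managed Apache Airflow, so the orchestration side of this stack typically means authoring DAGs. A minimal sketch of a daily BigQuery transformation task follows; the DAG id, dataset, and query are hypothetical examples, not part of this role.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="daily_orders_summary",  # hypothetical DAG name
        start_date=datetime(2026, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Submit a standard-SQL transformation as a BigQuery job.
        build_summary = BigQueryInsertJobOperator(
            task_id="build_daily_summary",
            configuration={
                "query": {
                    "query": (
                        "CREATE OR REPLACE TABLE analytics.daily_summary AS "
                        "SELECT order_date, SUM(amount) AS revenue "
                        "FROM analytics.orders "
                        "GROUP BY order_date"
                    ),
                    "useLegacySql": False,
                }
            },
        )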
✅ Requirements
• Proven experience as a Data Engineer working on GCP
• Strong experience building data pipelines at scale
• Experience with both batch and real-time processing
• Solid SQL skills and experience with data warehousing (BigQuery)
• Comfortable working in a fast-paced, collaborative environment
⭐ Nice to Have
• Experience migrating data platforms to GCP
• BigQuery cost optimisation experience (see the dry-run sketch after this list)
• Exposure to CI/CD or infrastructure as code (e.g., Terraform)
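On the cost-optimisation point: a common first step is a dry-run query, which reports the bytes a query would scan without billing for execution. A minimal sketch with the google-cloud-bigquery client (table and filter are hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    sql = (
        "SELECT user_id, event_ts "
        "FROM analytics.events "
        "WHERE DATE(event_ts) = '2026-04-01'"
    )

    # dry_run validates the query and estimates bytes scanned without running it.
    job = client.query(
        sql,
        job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False),
    )
    print(f"Query would scan {job.total_bytes_processed / 1e9:.2f} GB")

Partitioning and clustering the table on the filtered columns is the usual follow-up for bringing that number down.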
📩 Interested?
Drop me a message or comment below, and I’ll reach out with more details.
#Hiring #DataEngineer #GCP #GoogleCloud #BigData #RemoteJobs #ContractJobs