

GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with 5–7 years of experience, focusing on GCP services (Cloud Composer, Airflow), dbt (SQL & Python), and GitLab. Contract length and pay rate are unspecified. Remote work is available.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
-
🗓️ - Date discovered
August 1, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Cambridge, England, United Kingdom
🧠 - Skills detailed
#Data Quality #GitLab #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Data Analysis #Agile #Data Engineering #Python #Data Pipeline #Airflow #Scripting #Version Control #Scala #SQL (Structured Query Language) #Cloud #Data Architecture #dbt (data build tool)
Role description
About the Role:
We are seeking a skilled and motivated Data Engineer with strong expertise in Google Cloud Platform (GCP) services, including Cloud Composer and Airflow, as well as proficiency in dbt (SQL & Python) and GitLab. This role is ideal for someone who thrives in a fast-paced environment and is passionate about building scalable data pipelines and solutions.
Key Responsibilities:
• Design, develop, and maintain robust data pipelines using GCP services such as Cloud Composer and Airflow.
• Implement data transformation workflows using DBT with SQL and Python.
• Collaborate with data analysts, scientists, and other engineers to ensure data quality and accessibility.
• Manage version control and CI/CD pipelines using GitLab.
• Optimize data workflows for performance, scalability, and reliability.
• Monitor and troubleshoot data pipeline issues and ensure timely resolution.
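To give a flavour of the first two responsibilities: a dbt-style transformation ultimately boils down to a SQL model materialised over warehouse tables. The stdlib-only Python sketch below illustrates that extract-transform-load shape (table and column names are hypothetical, and SQLite stands in for a GCP warehouse such as BigQuery):

```python
import sqlite3

# SQLite stands in for the warehouse here; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_id INTEGER, amount_pence INTEGER, status TEXT)"
)
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1250, "paid"), (2, 300, "refunded"), (3, 9900, "paid")],
)

# Transform: the kind of model dbt would materialise from a SELECT --
# keep only paid orders and convert pence to pounds.
conn.execute(
    """
    CREATE TABLE stg_orders AS
    SELECT order_id, amount_pence / 100.0 AS amount_gbp
    FROM raw_orders
    WHERE status = 'paid'
    """
)

rows = conn.execute(
    "SELECT order_id, amount_gbp FROM stg_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 12.5), (3, 99.0)]
```

In a Cloud Composer deployment, a step like this would typically run as a task inside an Airflow DAG rather than as a standalone script.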
Required Skills & Qualifications:
• 5–7 years of experience as a Data Engineer or in a similar role.
• Strong hands-on experience with GCP services, especially Cloud Composer and Airflow.
• Proficiency in DBT, SQL, and Python for data transformation and scripting.
• Experience with GitLab for version control and CI/CD.
• Solid understanding of data architecture, ETL/ELT processes, and data warehousing concepts.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration abilities.
Preferred Qualifications:
• GCP certification(s) in data engineering or related areas.
• Experience working in agile development environments.
• Familiarity with other cloud platforms or data tools is a plus.