

TAGMATIX360
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a contract basis in Cambridge, UK. Requires 5–7 years of experience and expertise in GCP (Cloud Composer, Airflow), dbt (SQL & Python), and GitLab. GCP certification preferred. Hybrid work environment.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 10, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Cambridge, England, United Kingdom
-
🧠 - Skills detailed
#GitLab #GCP (Google Cloud Platform) #Cloud #Data Engineering #Data Pipeline #Data Quality #Data Analysis #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Airflow #Version Control #Scala #Agile #Data Architecture #Python #Scripting #dbt (data build tool)
Role description
Position: Data Engineer
Location: (Hybrid) Cambridge, UK
Job Type: Contract
About the Role:
We are seeking a skilled and motivated Data Engineer with strong expertise in Google Cloud Platform (GCP) services, including Cloud Composer and Airflow, as well as proficiency in dbt (SQL & Python) and GitLab. This role is ideal for someone who thrives in a fast-paced environment and is passionate about building scalable data pipelines and solutions.
Key Responsibilities:
• Design, develop, and maintain robust data pipelines using GCP services such as Cloud Composer and Airflow.
• Implement data transformation workflows using dbt with SQL and Python.
• Collaborate with data analysts, data scientists, and other engineers to ensure data quality and accessibility.
• Manage version control and CI/CD pipelines using GitLab.
• Optimize data workflows for performance, scalability, and reliability.
• Monitor and troubleshoot data pipeline issues and ensure timely resolution.
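To give candidates a feel for the dbt-style transformation work these bullets describe, here is a rough, hypothetical sketch: table and model names are invented, and Python's built-in sqlite3 stands in for a real warehouse such as BigQuery. In practice dbt compiles a Jinja-templated SELECT into a `CREATE TABLE/VIEW ... AS` statement and runs it against the warehouse; the example mimics that one step.

```python
import sqlite3

# Stand-in "warehouse" with a raw source table (hypothetical schema/data).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount_pence INTEGER, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, 1250, 'completed'),
        (2, 900,  'cancelled'),
        (3, 4300, 'completed');
""")

# A dbt model is essentially a SELECT; dbt wraps it in CREATE VIEW/TABLE ... AS.
conn.execute("""
    CREATE VIEW stg_orders AS
    SELECT order_id,
           amount_pence / 100.0 AS amount_gbp   -- normalise units
    FROM raw_orders
    WHERE status = 'completed'                  -- filter out cancelled orders
""")

rows = conn.execute(
    "SELECT order_id, amount_gbp FROM stg_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 12.5), (3, 43.0)]
```

In a real project this SELECT would live in a `models/` file, be version-controlled in GitLab, and be scheduled via an Airflow DAG in Cloud Composer.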
Required Skills & Qualifications:
• 5–7 years of experience as a Data Engineer or in a similar role.
• Strong hands-on experience with GCP services, especially Cloud Composer and Airflow.
• Proficiency in dbt, SQL, and Python for data transformation and scripting.
• Experience with GitLab for version control and CI/CD.
• Solid understanding of data architecture, ETL/ELT processes, and data warehousing concepts.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration abilities.
Preferred Qualifications:
• GCP certification(s) in data engineering or related areas.
• Experience working in agile development environments.
• Familiarity with other cloud platforms or data tools is a plus.