

AGIT Consultancy
Senior Data Engineer (GCP)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (GCP) with a contract length of "Unknown", offering a pay rate of "Unknown". The position is hybrid and requires expertise in SQL, Python, Azure services, and data governance. Experience in manufacturing or FMCG is desirable.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Scala #ADF (Azure Data Factory) #Data Governance #Data Privacy #JSON (JavaScript Object Notation) #Storage #Azure Data Factory #Data Architecture #dbt (data build tool) #SQL (Structured Query Language) #Data Quality #REST (Representational State Transfer) #Tableau #Disaster Recovery #Version Control #Azure #Compliance #Airflow #DevOps #Documentation #IAM (Identity and Access Management) #Anomaly Detection #ETL (Extract, Transform, Load) #Data Transformations #XML (eXtensible Markup Language) #Data Ingestion #GCP (Google Cloud Platform) #Data Lake #Databricks #Agile #GDPR (General Data Protection Regulation) #Microsoft Power BI #Synapse #Cloud #Leadership #Data Catalog #Logging #BI (Business Intelligence) #Data Engineering #Classification #ADLS (Azure Data Lake Storage) #Data Encryption #Batch #Data Lakehouse #Data Lineage #Deployment #Observability #Terraform #Data Pipeline #Python #HADR (High Availability Disaster Recovery) #Kafka (Apache Kafka) #Automation #Snowflake #Security #Delta Lake
Role description
Job Ref: 843 | Senior Data Engineer (GCP) (Hybrid | Outside IR35)
Apply via LinkedIn or email your CV to: HR@AGITCONSULTANCY.CO.UK
About the Role
We are looking for an experienced Senior Data Engineer to lead the design, development, and optimisation of our cloud-based data infrastructure. This role will take technical ownership of key data engineering initiatives, mentor junior team members, and ensure the security, scalability, and reliability of our data platform. Working closely with engineering, analytics, and security stakeholders, you will play a critical role in enabling data-driven decision-making across the business.
Key Duties, Responsibilities & Accountabilities
Data Architecture & Engineering Leadership
• Lead the design and implementation of robust, secure, and scalable data pipelines using ELT/ETL patterns.
• Develop and promote best practices around modular pipeline design, orchestration, and data transformations.
• Serve as a technical mentor to other data engineers and support peer reviews and architecture discussions.
Advanced Pipeline Design & Optimisation
• Architect real-time, batch, and micro-batch data workflows for business-critical use cases.
• Leverage modern orchestration tools (e.g. Airflow, Azure Data Factory, dbt Cloud) for scheduling, observability, and lineage tracking.
• Monitor and fine-tune performance of pipelines, queries, and storage systems to ensure efficiency and cost control.
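Orchestration tools such as Airflow model a pipeline as a directed acyclic graph of tasks and schedule each task only after its upstream dependencies complete. A minimal stdlib sketch of that idea (the task names are hypothetical, not from this role):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical pipeline steps mapped to their upstream dependencies.
deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_sales": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_sales"},
}

def run_order(dependencies):
    """Return a valid execution order for the pipeline DAG."""
    return list(TopologicalSorter(dependencies).static_order())

order = run_order(deps)
# Both extracts run before the transform; the load runs last.
```

An orchestrator layers scheduling, retries, and lineage tracking on top of exactly this dependency resolution.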
Cloud Infrastructure & Platform Ownership
• Take ownership of data platform infrastructure across Azure (Data Lake, Synapse, Event Hubs, etc.) and/or hybrid environments.
• Implement CI/CD workflows, DevOps practices, and infrastructure-as-code (Terraform, Bicep) for data pipeline deployments.
• Drive initiatives for high availability, disaster recovery, and fault-tolerant design.
Data Governance, Security & Privacy
• Embed data privacy and security-by-design into all engineering workflows, including data masking, encryption, and access controls.
• Collaborate with Information Security teams to ensure compliance with GDPR and internal data handling policies.
• Lead efforts in data lineage, classification, and audit logging for sensitive data assets.
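One common masking technique implied by the bullets above is deterministic pseudonymisation: hashing a sensitive value with a secret key so it can still act as a join key without exposing the raw data. A minimal sketch, assuming a keyed-HMAC approach (the field names and key are illustrative, and in practice the key would come from a vault, not source code):

```python
import hashlib
import hmac

# Hypothetical secret; in production this would be fetched from a key vault.
MASK_KEY = b"example-secret-key"

def pseudonymise(value: str) -> str:
    """Deterministically mask a sensitive value so equal inputs
    still produce equal (joinable) outputs."""
    return hmac.new(MASK_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Return a copy of the record with sensitive fields pseudonymised."""
    return {
        k: pseudonymise(v) if k in sensitive_fields else v
        for k, v in record.items()
    }

row = {"customer_id": "C-001", "email": "jane@example.com", "total": 42}
masked = mask_record(row, {"email"})
```

Because the mapping is deterministic, the masked column can still be used for joins and deduplication across datasets.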
Integration & Interoperability
• Design and maintain integrations with ERP systems (e.g., Infor M3, ION), planning tools, and external data sources via APIs or SFTP.
• Define reusable data ingestion frameworks for structured and semi-structured formats (JSON, XML, CSV, Avro, Parquet).
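A "reusable ingestion framework" typically means normalising every supported input format into one common record shape behind a single entry point. A minimal sketch of that pattern using only the standard library (format names and payloads are illustrative):

```python
import csv
import io
import json

def parse_json_lines(text: str):
    """Yield records from newline-delimited JSON."""
    for line in text.splitlines():
        if line.strip():
            yield json.loads(line)

def parse_csv(text: str):
    """Yield records from CSV text with a header row."""
    yield from csv.DictReader(io.StringIO(text))

# Registry of parsers; adding a format means adding one entry here.
PARSERS = {"jsonl": parse_json_lines, "csv": parse_csv}

def ingest(payload: str, fmt: str) -> list:
    """Normalise any supported format into a list of dict records."""
    return list(PARSERS[fmt](payload))

records = ingest('{"id": 1}\n{"id": 2}\n', "jsonl")
```

Downstream transformations then only ever see dict records, regardless of whether the source was an API, SFTP drop, or flat file.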
Data Quality, Testing & Observability
• Implement data validation, testing, and anomaly detection frameworks using tools like dbt tests, Great Expectations, or custom solutions.
• Ensure clear data lineage and documentation from source to consumption.
• Set up observability dashboards and alerts to ensure data pipeline reliability and transparency.
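Frameworks like dbt tests and Great Expectations express data quality as declarative expectations evaluated against each batch. A custom-solution sketch of the same idea with two hypothetical checks (a null-rate threshold and a value range), using only the standard library:

```python
def expect_not_null(records, field, max_null_rate=0.0):
    """Pass if the share of null/empty values stays under the threshold."""
    nulls = sum(1 for r in records if r.get(field) in (None, ""))
    rate = nulls / len(records) if records else 0.0
    return rate <= max_null_rate

def expect_between(records, field, low, high):
    """Pass if every non-null value falls inside [low, high]."""
    return all(low <= r[field] <= high for r in records if r.get(field) is not None)

# Hypothetical batch of records to validate.
rows = [{"qty": 3}, {"qty": 7}, {"qty": None}]
checks = {
    "qty mostly present": expect_not_null(rows, "qty", max_null_rate=0.5),
    "qty in range": expect_between(rows, "qty", 0, 100),
}
failed = [name for name, ok in checks.items() if not ok]
```

In a pipeline, `failed` would feed an alert or block the downstream load, which is exactly what the observability dashboards in the bullet above surface.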
Collaboration & Delivery
• Partner with analysts, data scientists, and business stakeholders to translate business needs into scalable data solutions.
• Participate in project planning, estimation, and agile delivery across the data roadmap.
• Document architecture decisions, data dictionaries, and engineering standards.
Knowledge, Skills and Experience
Essential Experience & Skills
• 7+ years in data engineering, with experience in leading technical delivery in enterprise environments.
• Expert-level knowledge of SQL and Python for data transformation and automation.
• Strong experience with Azure data services (Data Factory, Synapse, Event Hub, ADLS).
• Proven experience building and optimising data lakehouse architectures (e.g., Delta Lake, Databricks, Snowflake).
• Hands-on experience with orchestration tools (Airflow, ADF) and data modelling (dimensional/star/snowflake).
• Familiarity with REST/SOAP APIs and event streaming platforms (e.g., Kafka, Azure Event Hub).
• Awareness of security, data protection, and compliance requirements (GDPR, data encryption, IAM).
• Experience with CI/CD pipelines, version control, and DevOps principles.
Desirable Experience & Skills
• Experience working with manufacturing or FMCG systems and data (ERP, MES, TPM).
• Familiarity with Microsoft Purview or data cataloging solutions.
• Exposure to Power BI, Tableau or other reporting tools.
• Certifications in Azure Data Engineering (e.g., DP-203), dbt, or cloud architecture.
Equal Opportunity Statement
• AGIT Consultancy is an equal opportunity employer and does not discriminate on the basis of race, color, religion, sex, age, national origin, veteran status, disability, sexual orientation/gender identity, or any other characteristic protected by applicable law.






