

AGIT Consultancy
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focusing on GCP. Contract length and day rate are unspecified; the engagement is Outside IR35. Key skills include ETL/ELT, data warehousing, and cloud-native solutions. GCP certifications are required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 7, 2026
-
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#SAP #Programming #BI (Business Intelligence) #Compliance #Big Data #Data Modeling #Scrum #Agile #DevOps #IAM (Identity and Access Management) #Data Quality #Data Science #Data Engineering #Dataflow #Data Migration #Teradata #IP (Intellectual Property) #Data Lake #Visualization #Security #ML (Machine Learning) #Jenkins #Data Processing #Batch #Logging #Monitoring #Oracle #Strategy #Storage #Microsoft Power BI #Scala #Data Governance #AI (Artificial Intelligence) #dbt (data build tool) #Data Warehouse #BigQuery #Leadership #Looker #Data Pipeline #Airflow #Data Architecture #Talend #ETL (Extract, Transform, Load) #Migration #Terraform #Cloud #Python #Data Catalog #GCP (Google Cloud Platform) #SQL (Structured Query Language)
Role description
Job Ref: 842 | Sr. Data Engineer | GCP | Hybrid | Outside IR35
Apply via LinkedIn or email your CV to HR@AGITCONSULTANCY.CO.UK
Lead Data Architect / Senior Data Engineer (10+ Years Experience)
Professional Summary
Results-driven Lead Data Architect / Senior Data Engineer with 10+ years of experience in designing, building, and optimizing scalable data platforms and pipelines on Google Cloud Platform (GCP).
Combines strong architectural expertise with hands-on engineering capabilities, specializing in data warehousing, big data processing, streaming & batch pipelines, and cloud-native solutions. Proven ability to lead teams, define enterprise data strategies, and deliver end-to-end data solutions, including large-scale data migrations from legacy systems (SAP, Oracle, Teradata) to BigQuery.
Experienced in enabling data governance, advanced analytics, and AI/ML-driven solutions, while ensuring high standards in performance, scalability, and security.
______________
Core Competencies
• Enterprise Data Architecture & Strategy
• ETL/ELT Pipeline Design & Optimization
• Data Warehousing & Data Modeling (OLAP/OLTP)
• Big Data & Distributed Data Processing
• Streaming & Batch Data Pipelines
• Data Governance, Quality & Security
• Cloud Data Engineering (GCP)
• Data Migration & Modernization
• Technical Leadership & Team Mentorship
• Agile Delivery & CI/CD Practices
______________
Technical Skills
Programming & Query Languages
• Python
• SQL
GCP Services
• BigQuery
• Cloud Storage (GCS)
• Dataflow
• Dataproc
• Pub/Sub
• Cloud Composer (Airflow)
• Cloud Functions
• Data Fusion
• Dataplex
• Data Catalog
• Cloud SQL
• Cloud Logging & Monitoring
Tools & Frameworks
• dbt
• Talend
• Terraform
• Jenkins
______________
Professional Experience Highlights
Architecture & Solution Design
• Define and maintain enterprise data architecture principles, standards, and governance frameworks
• Design and implement scalable data lake and data warehouse architectures on GCP
• Architect end-to-end data solutions supporting analytics, reporting, and data science use cases
• Lead solution design, POCs, and technology evaluations
Data Engineering & Pipeline Development
• Build and optimize ETL/ELT pipelines using BigQuery, Dataflow, Pub/Sub, and GCS
• Develop streaming and batch data pipelines for real-time and large-scale processing
• Implement workflow orchestration using Cloud Composer (Airflow); a minimal DAG sketch follows this list
• Write efficient, scalable data processing code in Python and SQL
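To give a flavor of the orchestration work described above, here is a minimal sketch of a Cloud Composer DAG that runs a scheduled ELT transform inside BigQuery. It assumes an Airflow 2.4+ environment with the Google provider installed; the DAG ID, project, dataset, and table names are hypothetical placeholders, not details from this role.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_elt",         # hypothetical pipeline name
    schedule="0 2 * * *",             # run at 02:00 UTC each day
    start_date=datetime(2026, 1, 1),
    catchup=False,
) as dag:
    # Push the transform down into BigQuery: rebuild a curated table
    # from the raw landing dataset (ELT rather than ETL).
    transform_sales = BigQueryInsertJobOperator(
        task_id="transform_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.curated.daily_sales` AS
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-project.raw.sales`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```

Pushing the transform into BigQuery keeps Airflow as a thin scheduler; a Dataflow (Apache Beam) job would take over where per-record streaming processing from Pub/Sub is needed.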
Data Migration & Modernization
• Lead end-to-end migration programs from legacy platforms (SAP, Oracle, Teradata) to GCP
• Design cost-efficient and high-performance migration strategies to BigQuery (a sample staged load is sketched after this list)
• Modernize data platforms using lakehouse and cloud-native architectures
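To illustrate the batch side of such a migration, below is a minimal sketch of loading Parquet extracts staged in Cloud Storage into BigQuery with the google-cloud-bigquery client. The bucket, project, and table names are hypothetical; real SAP/Oracle/Teradata offloads layer change-data-capture and schema mapping on top, but the staged-extract-then-load shape is the same.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    # Replace the target on each run so the load is idempotent
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load every Parquet extract for one legacy table in a single job
load_job = client.load_table_from_uri(
    "gs://legacy-extracts/orders/*.parquet",  # hypothetical staging path
    "my-project.migrated.orders",             # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load completes; raises on failure

table = client.get_table("my-project.migrated.orders")
print(f"Loaded {table.num_rows} rows into migrated.orders")
```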
Data Governance & Quality
• Implement data governance frameworks ensuring data quality, lineage, and compliance
• Establish data validation, monitoring, and auditing mechanisms (a minimal check is sketched below)
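As a minimal sketch of the kind of post-load validation this implies, the check below asserts row counts and key completeness against the hypothetical migrated.orders table from the previous example. In practice dbt tests or Dataplex data quality rules would express the same assertions declaratively.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# One scan computes both checks: total rows and NULLs in the business key
query = """
    SELECT
      COUNT(*) AS row_count,
      COUNTIF(order_id IS NULL) AS null_keys
    FROM `my-project.migrated.orders`
"""
row = next(iter(client.query(query).result()))

# Fail loudly so an orchestrator (e.g. Airflow) marks the task as failed
assert row.row_count > 0, "migrated.orders is empty"
assert row.null_keys == 0, f"{row.null_keys} rows have a NULL order_id"
```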
Leadership & Stakeholder Engagement
• Lead and mentor data engineering teams, providing architectural and technical guidance
• Collaborate with business stakeholders to translate requirements into scalable solutions
• Drive client discussions, workshops, and technical presentations
• Contribute to solution decks, IP development, and innovation initiatives
______________
Innovation & Advanced Capabilities
• Lead initiatives in Generative AI and AI/ML use cases on GCP (Vertex AI)
• Explore and implement new GCP services to enhance data platform capabilities
• Support data science and advanced analytics workloads
______________
Additional Skills
• AI/ML & Generative AI (Vertex AI)
• IAM, Security & Access Control
• DevOps & CI/CD (Terraform, Jenkins)
• Data Visualization (Looker, Power BI)
• Agile/Scrum methodologies
• Project & Program Management
______________
Certifications
• Google Cloud Professional Cloud Architect
• Google Cloud Professional Data Engineer
• Google Cloud Professional Database Engineer
______________
Key Strengths
• Strong balance of architecture + hands-on engineering expertise
• Proven ability to deliver scalable, production-grade data solutions
• Excellent communication, leadership, and stakeholder management skills
• Deep expertise in GCP data ecosystem and modern data platforms
• Continuous learner with focus on innovation and emerging technologies