

EXL
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with 10+ years of experience, focusing on DBT, Cloud Composer, Python, and Terraform. It's a flexible hybrid contract in Edinburgh, UK, with an inside IR35 pay structure, starting ASAP.
Country
United Kingdom
-
Currency
£ GBP
-
Day rate
Unknown
-
Date
March 25, 2026
-
Duration
Unknown
-
Location
Hybrid
-
Contract
Inside IR35
-
Security
Unknown
-
Location detailed
Edinburgh, Scotland, United Kingdom
-
Skills detailed
#Deployment #dbt (data build tool) #Data Quality #GCP (Google Cloud Platform) #Scala #AI (Artificial Intelligence) #BigQuery #Leadership #Python #Data Processing #Infrastructure as Code (IaC) #Data Pipeline #Cloud #Migration #Terraform #ETL (Extract, Transform, Load) #Data Engineering #Apache Airflow #Data Governance #Airflow #DevOps #Automation #Storage
Role description
EXL (NASDAQ: EXLS) is a global data and artificial intelligence ("AI") company that offers services and solutions to reinvent client business models, drive better outcomes and unlock growth with speed. EXL harnesses the power of data, AI, and deep industry knowledge to transform businesses, including the world's leading corporations in industries including insurance, healthcare, banking and financial services, media and retail, among others. EXL was founded in 1999 with the core values of innovation, collaboration, excellence, integrity and respect.
We are headquartered in New York and have more than 60,000 employees spanning six continents. For more information, visit www.exlservice.com.
Role: GCP Data Engineer (DBT, Cloud Composer, Python, Terraform)
BU/Segment: Banking / Analytics
Location: Edinburgh, United Kingdom (Flexible hybrid working)
Employment Type: Umbrella Contract (Inside IR35) to start ASAP
Experience: 10+ years
We are seeking an experienced GCP Data Engineer with strong expertise in DBT, Cloud Composer, Python, and Terraform. This role will focus on migrating legacy data platforms and regulatory use cases (e.g., risk, finance, RWA) to GCP, while actively contributing to design and development. The ideal candidate combines strong technical depth with the ability to work alongside a team of engineers to deliver scalable, high-quality data solutions.
As part of your duties, you will:
β’ Lead the design, development, and deployment of data pipelines on GCP.
β’ Drive migration of legacy data platforms and use cases to GCP, ensuring minimal disruption and optimal performance.
β’ Build and manage data transformation workflows using DBT.
β’ Orchestrate pipelines using Cloud Composer (Apache Airflow).
β’ Develop robust, reusable code in Python for data processing and automation.
β’ Implement Infrastructure as Code (IaC) using Terraform for scalable and repeatable deployments.
β’ Collaborate with business and technology stakeholders to understand requirements and translate them into technical solutions.
β’ Ensure data quality, governance, and best practices across all implementations.
β’ Provide technical leadership, mentor team members, and guide design decisions.
Qualifications and experience we consider to be essential for the role:
β’ Strong hands-on experience with Google Cloud Platform (BigQuery, Cloud Storage, etc.).
β’ Proven experience in DBT for data transformation.
β’ Expertise in Cloud Composer / Apache Airflow for workflow orchestration.
β’ Advanced proficiency in Python.
β’ Solid experience with Terraform for infrastructure provisioning.
β’ Demonstrated experience in migrating legacy systems (on-prem or other cloud) to GCP.
β’ Strong understanding of data warehousing concepts and ETL/ELT frameworks.
β’ Experience in leading teams and managing end-to-end delivery.
Preferred Qualifications:
β’ Experience in large-scale data transformation programs.
β’ Familiarity with CI/CD pipelines and DevOps practices.
β’ Exposure to data governance and regulatory environments (e.g., banking/financial services).
β’ Strong problem-solving and stakeholder management skills.
Soft Skills:
β’ Strong leadership and communication skills.
β’ Ability to work in a fast-paced, collaborative environment.
β’ Proactive mindset with a focus on ownership and delivery.
To be considered for this role, you must already be eligible to work in the United Kingdom.