

Coltech
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract focused on Teradata to BigQuery migration within a FinTech environment. Key skills include strong SQL, Teradata and BigQuery experience, and data migration expertise. Contract length and pay rate are unspecified.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: January 15, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Texas, United States
Skills detailed: #Storage #Teradata #Cloud #Data Pipeline #dbt (data build tool) #Teradata SQL #"ETL (Extract, Transform, Load)" #Migration #Dataflow #Python #AI (Artificial Intelligence) #ML (Machine Learning) #SQL (Structured Query Language) #Scala #DevOps #BigQuery #Data Engineering #Data Quality #Data Warehouse #GCP (Google Cloud Platform) #Data Architecture #Schema Design
Role description
Data Engineer – Teradata to BigQuery Migration (FinTech)
Contract | Data & AI Platform Programme
We're supporting a FinTech organisation (500–5,000 employees) delivering a major Data & AI platform modernisation, migrating from Teradata to Google BigQuery as part of a wider AI-readiness programme.
This role sits within a delivery-focused data platform team, responsible for migrating, modernising, and optimising data workloads for a cloud-native, AI-ready future.
What You'll Be Working On
• Migrating data and workloads from Teradata to BigQuery
• Re-engineering and optimising Teradata SQL for BigQuery (an illustrative rewrite is sketched after this list)
• Building and maintaining cloud-native data pipelines
• Validating data quality, reconciliation, and performance post-migration (a reconciliation query sketch also follows this list)
• Working closely with Cloud Architects, Data Architects, and DevOps
• Supporting analytics, ML, and future GenAI use cases
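To illustrate the kind of SQL re-engineering this typically involves, here is a minimal sketch of a Teradata query rewritten for BigQuery. The table and column names are hypothetical, not taken from this role, and the exact translation depends on the workload.

-- Hypothetical Teradata query: TOP n, ZEROIFNULL, ADD_MONTHS and
-- integer date arithmetic are Teradata-specific and need rewriting.
SELECT TOP 10
    customer_id,
    ZEROIFNULL(balance)       AS balance,
    ADD_MONTHS(open_date, 12) AS renewal_date
FROM accounts
WHERE open_date >= CURRENT_DATE - 90
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY open_date DESC) = 1;

-- Equivalent BigQuery SQL (QUALIFY is supported, but the
-- Teradata-specific functions and date arithmetic are not).
SELECT
    customer_id,
    IFNULL(balance, 0)                     AS balance,
    DATE_ADD(open_date, INTERVAL 12 MONTH) AS renewal_date
FROM `project.dataset.accounts`
WHERE open_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY open_date DESC) = 1
ORDER BY open_date DESC
LIMIT 10;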
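For the post-migration reconciliation work, a common pattern is to compare row counts and key aggregates between a staged copy of the Teradata extract and the migrated BigQuery table, flagging any day that disagrees. Again, the table names below are hypothetical and only indicative of the approach.

-- Reconciliation sketch: compare daily row counts and totals between
-- the staged Teradata extract and the migrated target table.
WITH source_stats AS (
  SELECT DATE(txn_ts) AS txn_date,
         COUNT(*)     AS row_count,
         SUM(amount)  AS total_amount
  FROM `project.staging.teradata_transactions_extract`
  GROUP BY txn_date
),
target_stats AS (
  SELECT DATE(txn_ts) AS txn_date,
         COUNT(*)     AS row_count,
         SUM(amount)  AS total_amount
  FROM `project.warehouse.transactions`
  GROUP BY txn_date
)
SELECT COALESCE(s.txn_date, t.txn_date) AS txn_date,
       s.row_count                      AS source_rows,
       t.row_count                      AS target_rows,
       COALESCE(s.total_amount, 0) - COALESCE(t.total_amount, 0) AS amount_diff
FROM source_stats s
FULL OUTER JOIN target_stats t
  ON s.txn_date = t.txn_date
-- Only surface days that disagree on counts or totals.
WHERE s.row_count IS DISTINCT FROM t.row_count
   OR ABS(COALESCE(s.total_amount, 0) - COALESCE(t.total_amount, 0)) > 0.01;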
Required Experience
• Strong commercial experience as a Data Engineer
• Hands-on Teradata experience (SQL, performance tuning, schema design)
• BigQuery experience in production environments
• Proven experience on data warehouse / platform migration projects
• Strong SQL skills and data modelling fundamentals
• Experience working in regulated or data-sensitive environments
Nice to Have
• GCP services (Cloud Storage, Dataflow, Pub/Sub)
• Python or Scala for data engineering
• dbt or modern transformation tooling (a minimal dbt model sketch follows this list)
• CI/CD for data pipelines
• Exposure to AI / ML data pipelines
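If dbt is part of the transformation layer on BigQuery, a minimal incremental model might look like the following. This is purely an illustrative sketch; the model, source, and column names (fct_transactions, stg_transactions, txn_ts) are hypothetical.

-- models/marts/fct_transactions.sql (hypothetical dbt model)
{{ config(
    materialized='incremental',
    unique_key='txn_id',
    partition_by={'field': 'txn_date', 'data_type': 'date'}
) }}

SELECT
    txn_id,
    customer_id,
    amount,
    DATE(txn_ts) AS txn_date
FROM {{ ref('stg_transactions') }}
{% if is_incremental() %}
  -- on incremental runs, only reprocess the last few days
  WHERE DATE(txn_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY)
{% endif %}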






