Coltech

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract focused on Teradata to BigQuery migration within a FinTech environment. Key skills include strong SQL, Teradata and BigQuery experience, and data migration expertise. Contract length and pay rate are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Texas, United States
-
🧠 - Skills detailed
#Storage #Teradata #Cloud #Data Pipeline #dbt (data build tool) #Teradata SQL #"ETL (Extract, Transform, Load)" #Migration #Dataflow #Python #AI (Artificial Intelligence) #ML (Machine Learning) #SQL (Structured Query Language) #Scala #DevOps #BigQuery #Data Engineering #Data Quality #Data Warehouse #GCP (Google Cloud Platform) #Data Architecture #Schema Design
Role description
Data Engineer – Teradata → BigQuery Migration (FinTech)
Contract | Data & AI Platform Programme

We're supporting a FinTech organisation (500–5,000 employees) delivering a major Data & AI platform modernisation, migrating from Teradata to Google BigQuery as part of a wider AI-readiness programme. This role sits within a delivery-focused data platform team, responsible for migrating, modernising, and optimising data workloads for a cloud-native, AI-ready future.

What You'll Be Working On
• Migrating data and workloads from Teradata to BigQuery
• Re-engineering and optimising Teradata SQL for BigQuery
• Building and maintaining cloud-native data pipelines
• Validating data quality, reconciliation, and performance post-migration
• Working closely with Cloud Architects, Data Architects, and DevOps
• Supporting analytics, ML, and future GenAI use cases

Required Experience
• Strong commercial experience as a Data Engineer
• Hands-on Teradata experience (SQL, performance tuning, schema design)
• BigQuery experience in production environments
• Proven experience on data warehouse / platform migration projects
• Strong SQL skills and data modelling fundamentals
• Experience working in regulated or data-sensitive environments

Nice to Have
• GCP services (Cloud Storage, Dataflow, Pub/Sub)
• Python or Scala for data engineering
• dbt or modern transformation tooling
• CI/CD for data pipelines
• Exposure to AI / ML data pipelines
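The "validating data quality, reconciliation, and performance post-migration" responsibility above can be sketched as a per-table row-count reconciliation. This is an illustrative assumption about what such a check might look like, not part of the role description: in practice the counts would come from Teradata and BigQuery themselves (e.g. via the `teradatasql` and `google-cloud-bigquery` client libraries), but here they are passed in as plain dicts so the comparison logic stays self-contained.

```python
# Hypothetical sketch of post-migration row-count reconciliation.
# Table names and counts below are made up for illustration.

def reconcile_row_counts(source_counts: dict[str, int],
                         target_counts: dict[str, int]) -> dict[str, str]:
    """Compare per-table row counts between source and target.

    Returns a report mapping each table name to 'ok', a mismatch
    description, 'missing in target', or 'missing in source'.
    """
    report = {}
    for table, src in source_counts.items():
        if table not in target_counts:
            report[table] = "missing in target"
        elif target_counts[table] == src:
            report[table] = "ok"
        else:
            report[table] = f"mismatch ({src} -> {target_counts[table]})"
    for table in target_counts:
        if table not in source_counts:
            report[table] = "missing in source"
    return report


# Example run with invented figures:
teradata = {"accounts": 1_200_000, "txns": 9_800_000, "legacy_tmp": 42}
bigquery = {"accounts": 1_200_000, "txns": 9_799_950}

for table, status in sorted(reconcile_row_counts(teradata, bigquery).items()):
    print(f"{table}: {status}")
```

Row counts alone do not prove data fidelity, so a real reconciliation would typically extend this with checksums or column-level aggregates per table; counts are just the cheapest first pass.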