Anson McCade

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in London (Hybrid) with a contract length of 6 months, offering up to £660/day. Key skills required include advanced Python, expert SQL, experience with ETL processes, and familiarity with AWS and data pipeline tools.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
660
🗓️ - Date
February 17, 2026
🕒 - Duration
6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Airflow #AWS (Amazon Web Services) #Data Architecture #Batch #Data Engineering #Data Science #Git #ML (Machine Learning) #Python #Data Lake #ETL (Extract, Transform, Load) #Cloud #SQL (Structured Query Language) #Automated Testing #Data Pipeline
Role description
Data Engineer
London (Hybrid - 2 days on-site when required) | Up to £660/day (Inside IR35) | 6 months

A rare opportunity to join a large-scale, mission-critical operations environment that runs one of the most complex transport networks in the world. You’ll sit within a modern, product-led data squad, building and operating industrial-grade data platforms and pipelines that directly power the optimisation and machine-learning models used in real-time operational decision-making.

Key skills & experience:
• Strong data engineering experience in complex, data-rich operational environments
• Advanced Python for production systems (clean, tested, modular code)
• Expert-level SQL for building performant, reliable data models and pipelines
• Hands-on experience with modern data pipeline tooling and workflow orchestration (Dagster, Airflow, or similar)
• Experience designing and implementing robust ETL/ELT processes across multiple data sources
• Solid experience with cloud platforms (AWS preferred) and modern data architectures (data lakes/warehouses)
• Familiarity with CI/CD, Git, automated testing, and infrastructure-as-code practices
• Exposure to large-scale operational or logistics domains

What you’ll be doing:
• Designing, building, and maintaining batch and streaming data pipelines that power optimisation and ML products
• Developing robust Python-based data workflows and production-grade ingestion and transformation logic
• Modelling and optimising data structures to support analytics, reporting, and real-time decision systems
• Owning end-to-end data workflows: ingestion, transformation, orchestration, and integration into downstream applications
• Working closely with embedded data scientists, engineers, and product teams to ensure data is reliable, timely, and fit for purpose

To hear more, get in touch with Connor Smyth at Anson McCade.