twentyAI

Data Engineer - TWE43338

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer focused on Front Office Commodity Trading in London, on a contract of unspecified duration paying up to £700 p/d Inside IR35. Key skills include Python, PySpark, and Databricks, with 8+ years' experience building large-scale data platforms required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
700
🗓️ - Date
October 7, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#AWS (Amazon Web Services) #Delta Lake #SQL (Structured Query Language) #Terraform #DevOps #PySpark #Data Pipeline #Data Science #ML (Machine Learning) #Cloud #Kafka (Apache Kafka) #Data Warehouse #Automation #AI (Artificial Intelligence) #Databricks #Spark (Apache Spark) #Data Engineering #Apache Spark #Scala #Observability #Python #MLflow #Datasets
Role description
Senior Data Engineer – Front Office Commodity Trading

Key Details: London, 3 days in the office (min.), up to £700 p/d Inside IR35 via Umbrella
Key Skills: Python, PySpark, Databricks, Data Science/ML

Join a global leader in commodity trading and help build the next-generation analytics platform that powers front-office decisions worldwide. We’re looking for a Senior Data Engineer who thrives on scale, performance, and real-time data challenges. You’ll design and deliver cloud-native data pipelines and ML/AI infrastructure supporting trading, risk, and analytics across multiple commodities.

What You’ll Do
• Build and optimize scalable data pipelines and Delta Lake architectures in Databricks on AWS (see the first sketch below)
• Partner with data scientists and quants to deliver ML-ready datasets and production-grade AI pipelines
• Implement CI/CD, MLOps, and observability best practices using MLflow and Terraform (see the second sketch below)
• Shape the Front Office data warehouse, ensuring speed, reliability, and accuracy
• Collaborate globally across tech and trading to drive innovation and automation

What You Bring
• 8+ years building large-scale distributed data platforms
• Deep expertise in Databricks, Apache Spark, PySpark, Delta Lake, and Unity Catalog
• Strong Python, SQL, AWS, and Terraform skills
• Experience with DevOps/MLOps, and ideally exposure to streaming (Kafka) and real-time analytics
• Proven delivery in high-performance or trading environments
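To give a flavour of the first responsibility above, here is a minimal sketch of a batch Delta Lake pipeline in PySpark on Databricks. It is not the team's actual pipeline; the landing path s3://example-landing/trades/, the columns trade_id, trade_ts, and notional, and the table trades_curated are all illustrative assumptions.

```python
# Minimal sketch, assuming hypothetical paths, columns, and table names:
# ingest raw trade events, lightly curate them, publish a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades-pipeline").getOrCreate()

# Read raw JSON events from a landing zone (path is an assumption).
raw = spark.read.format("json").load("s3://example-landing/trades/")

# Curation: typed columns, a partition date, de-duplication on the key.
curated = (
    raw.withColumn("trade_ts", F.to_timestamp("trade_ts"))
       .withColumn("trade_date", F.to_date("trade_ts"))
       .withColumn("notional", F.col("notional").cast("double"))
       .dropDuplicates(["trade_id"])
)

# Publish as a Delta table, partitioned by date for fast front-office reads.
(
    curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("trade_date")
    .saveAsTable("trades_curated")
)
```

On Databricks, Delta is the default table format, so registering the result with saveAsTable makes it immediately queryable from SQL and downstream notebooks.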
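For the MLOps bullet, a hedged sketch of MLflow experiment tracking; the experiment path, features, model, and metric are assumptions for illustration, not details from the role.

```python
# Minimal MLflow tracking sketch; the experiment path, toy dataset, and
# metric below are hypothetical, not taken from the role description.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=200, n_features=5, random_state=0)

mlflow.set_experiment("/Shared/trading-analytics")  # assumed workspace path
with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=50, random_state=0)
    model.fit(X, y)
    mlflow.log_param("n_estimators", 50)
    mlflow.log_metric("train_r2", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for CI/CD
```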