

twentyAI
Data Engineer - TWE44291
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Python) on a contract basis until year-end, with potential extension up to 2 years. Located in London (3 days on-site), it offers £700 p/d and requires strong Python, SQL, and data modelling skills, along with Azure experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
700
🗓️ - Date
January 31, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Pipeline #Storage #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Databricks #Python #Forecasting #Plotly #Azure Blob Storage #Azure #Data Quality #AI (Artificial Intelligence) #Data Engineering #Kafka (Apache Kafka) #S3 (Amazon Simple Storage Service) #Streamlit #Airflow
Role description
Senior Data Engineer (Python) – Contract
Commodity Trading | Inside IR35 | London (3 days onsite)
A leading energy trading organisation is hiring a senior Python Data Engineer to support front-office analytics and forecasting during a major data platform transition.
This is a hands-on contract role for someone who can build robust pipelines, work directly with traders and analysts, and raise the bar across a mixed-ability data team.
Key details
• Contract: Initially until year end (potentially up to 2 years)
• Location: London – 3 days per week onsite
• Start: Immediate
• Rate: £700 p/d via umbrella
What you’ll be doing
• Building and maintaining Python + SQL data pipelines
• Integrating multi-source data (forecasts, wind/solar output, LNG movements, streaming feeds)
• Supporting energy fundamentals and forecasting dashboards
• Working closely with front-office stakeholders
• Improving data quality, modelling, and governance
• Contributing to best practices and mentoring across the team
• Helping transition the platform toward Databricks
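The day-to-day work above centres on Python + SQL extract-transform-load pipelines. As a purely illustrative sketch of that pattern (all names and the inline data are hypothetical; a real pipeline would pull from Azure Blob Storage or S3 rather than an inline string):

```python
# Minimal ETL sketch: extract hypothetical wind/solar forecast records,
# apply a basic data-quality filter, and load into SQL for querying.
import csv
import io
import sqlite3

# "Extract": an inline CSV stands in for a source file in object storage.
RAW = """asset,region,forecast_mw
wind_a,UK,120.5
solar_b,UK,80.0
wind_c,DE,
"""

def extract(raw: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    # Data-quality step: drop rows with a missing forecast value.
    return [
        (r["asset"], r["region"], float(r["forecast_mw"]))
        for r in rows
        if r["forecast_mw"]
    ]

def load(rows: list[tuple]) -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE forecasts (asset TEXT, region TEXT, forecast_mw REAL)")
    conn.executemany("INSERT INTO forecasts VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW)))
uk_total = conn.execute(
    "SELECT SUM(forecast_mw) FROM forecasts WHERE region = 'UK'"
).fetchone()[0]
print(uk_total)  # 200.5
```

In the role itself the same shape would be applied to multi-source feeds (forecasts, wind/solar output, LNG movements) landing in a proper warehouse rather than in-memory SQLite.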
Must-have skills (non-negotiable)
• Strong Python and SQL
• Solid data modelling and ETL fundamentals
• Experience with Azure Blob Storage or S3
• Comfortable discussing data with front-office users
• Able to mentor and uplift less experienced engineers
• Active use of AI coding tools (e.g. Copilot, Claude)
Nice-to-have experience
• Databricks (being implemented imminently)
• Orchestration: Dagster preferred (Airflow / Prefect acceptable)
• Real-time or near-real-time data (Kafka / Azure Event Hub)
• Python dashboarding: Dash, Plotly, Streamlit
• Energy or commodities trading exposure (helpful, not required)