

twentyAI
Python Data Engineer - TWE44291
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Data Engineer, contract length unspecified, with a pay rate of "competitive". Work location is "remote". Requires strong Python, SQL, data modeling, and ETL skills, plus experience with Azure Blob Storage. Databricks familiarity is beneficial.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Storage #Python #Forecasting #Data Quality #Data Engineering #Databricks #Airflow #Data Pipeline #"ETL (Extract, Transform, Load)" #S3 (Amazon Simple Storage Service) #Streamlit #Azure Blob Storage #AI (Artificial Intelligence) #Plotly #SQL (Structured Query Language) #Azure #Kafka (Apache Kafka)
Role description
A leading energy trading organisation requires a senior Python Data Engineer to support front office analytics and forecasting during a major data platform transition.
This is a highly technical engineering project for someone who can build robust data pipelines, models and products to enhance trading decisions and critical business processes. You'd engage directly with traders and analysts, and the team is rolling out Databricks, so experience with it would be highly beneficial.
What you’ll be doing
• Building and maintaining Python + SQL data pipelines
• Integrating multi-source data (forecasts, wind/solar output, LNG movements, streaming feeds)
• Supporting energy fundamentals and forecasting dashboards
• Working closely with front office stakeholders
• Improving data quality, modelling, and governance
• Contributing to best practices and mentoring across the team
• Helping transition the platform toward Databricks
Must-have skills (non-negotiable)
• Strong Python and SQL
• Solid data modelling and ETL fundamentals
• Experience with Azure Blob Storage or S3
• Comfortable discussing data with front-office users
• Able to mentor and uplift less experienced engineers
• Active use of AI coding tools (e.g. Copilot, Claude)
Nice-to-have experience
• Databricks (being implemented imminently)
• Orchestration: Dagster preferred (Airflow / Prefect acceptable)
• Real-time or near-real-time data (Kafka / Azure Event Hub)
• Python dashboarding: Dash, Plotly, Streamlit
• Energy or commodities trading exposure (helpful, not required)






