Deloitte
AI Engineer – Trading Analytics & Data Platforms
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI Engineer – Trading Analytics & Data Platforms, offering a 6-month contract in London (Hybrid). Requires strong expertise in Databricks, Spark, Python, and AI/ML solutions, with experience in trading environments preferred. Competitive day rate.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 21, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
City of London, England, United Kingdom
-
🧠 - Skills detailed
#Data Access #Documentation #Data Ingestion #Observability #Data Pipeline #Spark (Apache Spark) #Terraform #Python #Delta Lake #Databases #Data Engineering #Databricks #PySpark #Datasets #ML (Machine Learning) #Data Science #AI (Artificial Intelligence) #MLflow #Azure #Automated Testing #SQL (Structured Query Language) #Statistics #BI (Business Intelligence) #Spark SQL #Scala #Forecasting #Regression #Microsoft Power BI
Role description
AI Engineer – Trading Analytics & Data Platforms
London (Hybrid – 3 days/week onsite)
6-Month Contract Role (Inside IR35 – Competitive Day Rate)
About The Role
We are seeking an experienced AI Engineer with a strong data engineering foundation and a passion for solving real-world trading problems. You will work directly with traders and analysts, building AI-powered analytics and data solutions that deliver actionable insights from market pricing and fundamental data—often in near real-time.
This role sits at the intersection of AI, data engineering, and trading, with a focus on rapid prototyping, stakeholder collaboration, and production-grade delivery using modern Azure and Databricks ecosystems.
What You’ll Do
• Design and deliver AI-driven analytics for front-office use, including forecasting, seasonality, correlation, regression, and scenario modelling
• Build scalable, reusable data pipelines using Databricks (PySpark/Spark, Delta, Unity Catalog), optimizing performance, cost, and reliability (see the pipeline sketch after this list)
• Develop real-time and near real-time data solutions to support trading and reporting needs
• Translate complex trading problems into prototypes and MVPs, iterating rapidly based on feedback
• Partner closely with traders and analysts to understand requirements and communicate insights effectively
• Implement LLM and agent-based workflows (prompt engineering, orchestration, retrieval, tool usage, and guardrails)
• Perform statistical and econometric analysis on large-scale time-series datasets
• Productionize solutions with robust testing, observability, CI/CD pipelines, and documentation
• Enable reporting and data access via tools such as Power BI and similar platforms
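
To make the pipeline bullet above concrete, here is a minimal, illustrative sketch of a Databricks-style ingestion step. The landing path, column names, and Unity Catalog table name are hypothetical placeholders, not details from this role.

```python
# Minimal illustrative sketch of a Databricks ingestion step.
# The landing path, columns, and table name below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw market pricing files from a hypothetical landing zone.
raw = spark.read.format("json").load("/mnt/market-data/prices/")

# Basic cleaning: enforce types and de-duplicate per instrument and day.
cleaned = (
    raw.withColumn("price", F.col("price").cast("double"))
       .withColumn("trade_date", F.to_date("trade_date"))
       .dropDuplicates(["instrument_id", "trade_date"])
)

# Append into a Delta table registered in Unity Catalog (illustrative name).
(
    cleaned.write.format("delta")
           .mode("append")
           .saveAsTable("trading.analytics.daily_prices")
)
```

In practice a step like this would be scheduled as a Databricks job or workflow, with the table name and path supplied by configuration rather than hard-coded.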
What You’ll Bring
• Strong hands-on experience with Databricks and Spark (PySpark, SQL, Delta Lake, Unity Catalog)
• Proven data engineering expertise (data ingestion, modelling, orchestration, performance tuning)
• Solid foundation in statistics, econometrics, or data science, particularly with market time-series data (see the analysis sketch after this list)
• Experience building AI/ML and LLM-based solutions (prompting, retrieval, agent workflows)
• Proficiency in Python and modern data/ML tooling (e.g., MLflow, feature stores, vector databases)
• Familiarity with CI/CD, Terraform, and production-grade engineering practices
• Excellent communication and stakeholder management skills, with the ability to work directly with front-office users
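
As a hedged illustration of the statistics bullet above, the sketch below computes a rolling correlation and a simple weekly seasonal decomposition on synthetic daily prices. All series names, windows, and data are made up for illustration.

```python
# Illustrative time-series analysis on synthetic daily price data.
# Series names, window lengths, and values are made-up examples.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=365, freq="D")

# Two synthetic price series sharing a trend, plus independent noise.
trend = np.linspace(100, 120, len(idx))
gas = pd.Series(trend + rng.normal(0, 2, len(idx)), index=idx)
power = pd.Series(1.5 * trend + rng.normal(0, 3, len(idx)), index=idx)

# 30-day rolling correlation between the two series.
rolling_corr = gas.rolling(30).corr(power)

# Weekly seasonality via classical additive decomposition (period = 7 days).
decomposition = seasonal_decompose(gas, model="additive", period=7)

print(rolling_corr.dropna().tail())
print(decomposition.seasonal.head(7))
```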
Nice to Have
• Experience in commodity or financial trading environments
• Understanding of market fundamentals, derivatives, P&L, contracts, and trading lifecycle
• Knowledge of market microstructure, supply-demand dynamics, and risk management concepts
Ways of Working
• Hybrid model with close collaboration alongside trading teams
• Fast-paced, iterative delivery: prototype quickly, refine with users, and scale to production
• Strong focus on engineering excellence, including automated testing, governance, and operational reliability (see the test sketch after this list)
• Ability to work independently while leveraging support from a broader data and engineering team
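
To illustrate the automated-testing point above, a minimal pytest sketch for a small pipeline transform might look like the following. The transform, column names, and expectations are hypothetical.

```python
# Hypothetical pytest sketch for a small pipeline transform.
import pandas as pd

def deduplicate_prices(df: pd.DataFrame) -> pd.DataFrame:
    """Keep one row per instrument and trade date (illustrative transform)."""
    return df.drop_duplicates(subset=["instrument_id", "trade_date"])

def test_deduplicate_prices_removes_repeats():
    df = pd.DataFrame(
        {
            "instrument_id": ["TTF", "TTF", "NBP"],
            "trade_date": ["2025-01-02", "2025-01-02", "2025-01-02"],
            "price": [30.1, 30.1, 95.4],
        }
    )
    out = deduplicate_prices(df)
    assert len(out) == 2  # the duplicated TTF row is dropped
```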
Who You Are
• A pragmatic problem-solver who thrives in ambiguous, high-impact environments
• Comfortable bridging the gap between technical solutions and trading needs
• Driven to deliver measurable value through AI and data
If you’re excited about applying AI and advanced analytics in a trading environment and working directly with front-office stakeholders, we’d love to hear from you.