

Templeton and Partners - Tech Recruitment
Artificial Intelligence Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Artificial Intelligence Engineer; the contract length and pay rate are unspecified. It requires hands-on Databricks and Spark expertise, strong data engineering skills, and experience in commodity or financial trading. The work location is hybrid.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 22, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#PySpark #Data Lineage #SQL (Structured Query Language) #Scala #Data Science #Data Pipeline #Statistics #Data Engineering #Time Series #Spark SQL #Databricks #Terraform #Automated Testing #Security #MLflow #Regression #AI (Artificial Intelligence) #Observability #Documentation #Forecasting #Spark (Apache Spark) #Datasets
Role description
AI Engineer (Trading Analytics & Data Engineering)
About the role
We're hiring an AI Engineer with a strong data engineering foundation and excellent communication skills—ideally with commodity or financial trading experience. You'll partner with traders and trading analysts to rapidly build AI‑powered analytics over market pricing and fundamentals data, using Databricks and Spark to deliver value at speed.
What you'll do
Design and ship AI‑driven analytics for front‑office use (seasonality, correlation, regression, forecasting, scenario modelling).
Build reusable and scalable data pipelines in Databricks (PySpark/Spark, Delta/Unity Catalog), optimizing cost, reliability, and performance.
Run statistical/econometric analyses on large datasets (e.g., market & fundamental time series data).
Collaborate directly with traders/analysts—translate ambiguous questions into shippable solutions; communicate insights clearly.
Implement LLM/agentic workflows: prompt engineering, LangGraph orchestration, MCP integrations, tool calling, retrieval, and guardrails.
Productionize solutions with testing, observability, versioning, and documentation.
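To make the analytics work above concrete, here is a minimal, illustrative sketch of the kind of seasonality and correlation analysis described, using plain pandas on synthetic daily prices. All series names and parameters are hypothetical, invented for this example; production work would run on real market and fundamentals data in Databricks.

```python
import numpy as np
import pandas as pd

# Synthetic daily price series for two hypothetical commodities
# (geometric random walk, so prices stay positive; illustrative only).
rng = np.random.default_rng(42)
dates = pd.date_range("2020-01-01", "2023-12-31", freq="D")
prices = pd.DataFrame(
    {
        "brent": 60 * np.exp(np.cumsum(rng.normal(0, 0.01, len(dates)))),
        "wti": 55 * np.exp(np.cumsum(rng.normal(0, 0.01, len(dates)))),
    },
    index=dates,
)

# Daily log returns.
returns = np.log(prices).diff().dropna()

# Seasonality: average daily return grouped by calendar month.
seasonality = returns.groupby(returns.index.month).mean()

# Rolling 90-day correlation between the two return series.
rolling_corr = returns["brent"].rolling(90).corr(returns["wti"])
```

The same logic ports to PySpark (window functions and `groupBy` over a Delta table) when the datasets outgrow a single machine.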
What you'll bring
Hands‑on Databricks + Spark expertise (PySpark, SQL, Delta, Unity Catalog).
Proven data engineering skills (ingestion, modelling, orchestration, performance tuning).
Strong statistics/economics/data science fundamentals for market time‑series.
Experience building LLM solutions (prompting, retrieval, agent flows; LangGraph, MCP) and integrating with trading data/services.
Experience with CI/CD, Terraform, MLflow/feature stores, vector DBs, and governance (PII handling, data lineage).
Excellent stakeholder skills; able to work on‑desk with traders/analysts and deliver fast.
Nice to have
Background in commodity or financial trading.
Familiarity with market microstructure, supply‑demand fundamentals, risk management.
Ways of working
Hybrid; high‑touch collaboration with trading teams.
Bias to prototype fast, iterate with users, and harden to production.
Maintain operational stability of production pipelines using secure, modern CI/CD engineering practices: automated testing, quality gates, and built-in reliability across development, security, and operations.