USM Business Systems

AI Engineer (Trading Analytics & Data Engineering)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AI Engineer (Trading Analytics & Data Engineering) on a hybrid contract, offering a competitive pay rate. Key skills include Databricks, Spark, and data engineering; experience in commodity or financial trading is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#Data Science #PySpark #Datasets #Regression #Statistics #Data Engineering #Spark SQL #Terraform #Databricks #Time Series #Forecasting #Documentation #Spark (Apache Spark) #SQL (Structured Query Language) #Data Pipeline #Observability #Security #Scala #Data Lineage #Automated Testing #MLflow #AI (Artificial Intelligence)
Role description
About the role
We're hiring an AI Engineer with a strong data engineering foundation and excellent communication skills, ideally with commodity or financial trading experience. You'll partner with traders and trading analysts to rapidly build AI-powered analytics over market pricing and fundamentals data, using Databricks and Spark to deliver value at speed.

What you'll do
· Design and ship AI-driven analytics for front-office use (seasonality, correlation, regression, forecasting, scenario modelling).
· Build reusable, scalable data pipelines in Databricks (PySpark/Spark, Delta/Unity Catalog), optimizing cost, reliability, and performance.
· Run statistical/econometric analyses on large datasets (e.g., market and fundamental time series).
· Collaborate directly with traders/analysts: translate ambiguous questions into shippable solutions and communicate insights clearly.
· Implement LLM/agentic workflows: prompt engineering, LangGraph orchestration, MCP integrations, tool calling, retrieval, and guardrails.
· Productionize solutions with testing, observability, versioning, and documentation.

What you'll bring
· Hands-on Databricks + Spark expertise (PySpark, SQL, Delta, Unity Catalog).
· Proven data engineering skills (ingestion, modelling, orchestration, performance tuning).
· Strong statistics/economics/data science fundamentals for market time series.
· Experience building LLM solutions (prompting, retrieval, agent flows; LangGraph, MCP) and integrating them with trading data/services.
· Experience with CI/CD, Terraform, MLflow/feature stores, vector DBs, and governance (PII handling, data lineage).
· Excellent stakeholder skills; able to work on the desk with traders/analysts and deliver fast.

Nice to have
· Background in commodity or financial trading.
· Familiarity with market microstructure, supply-demand fundamentals, and risk management.

Ways of working
Hybrid; high-touch collaboration with trading teams.
Bias to prototype fast, iterate with users, and harden to production. Maintain operational stability of production pipelines using secure, modern CI/CD engineering practices, with automated testing, quality gates, and built-in reliability across development, security, and operations.
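As a rough illustration of the front-office analytics mentioned above (seasonality and correlation over market time series), here is a minimal sketch in plain pandas. The data is synthetic and every name and number is illustrative; in this role the equivalent analysis would typically read Delta tables via PySpark on Databricks.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
dates = pd.date_range("2020-01-01", periods=3 * 365, freq="D")

# Synthetic commodity price: an annual seasonal component plus a noisy drift.
seasonal = 10 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
price_a = 100 + seasonal + rng.normal(0, 2, len(dates)).cumsum() * 0.1
price_b = 80 + 0.6 * seasonal + rng.normal(0, 2, len(dates)).cumsum() * 0.1

df = pd.DataFrame({"price_a": price_a, "price_b": price_b}, index=dates)

# Seasonality: average price by calendar month.
monthly_seasonality = df["price_a"].groupby(df.index.month).mean()

# Correlation: 90-day rolling correlation of daily returns.
returns = df.pct_change().dropna()
rolling_corr = returns["price_a"].rolling(90).corr(returns["price_b"])

print(monthly_seasonality.round(2))
print(f"latest 90d return correlation: {rolling_corr.iloc[-1]:.2f}")
```

The same groupby/rolling logic translates almost directly to PySpark window functions once the series live in Delta tables.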