iO Associates - UK/EU

Platform Excellence (Contract)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a fully remote contract position focused on Platform Excellence within Databricks, requiring advanced Python, strong Databricks experience, and solid data engineering fundamentals. The advertised day rate is £700; the contract length is unspecified.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
700
-
🗓️ - Date
April 23, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
England, United Kingdom
-
🧠 - Skills detailed
#Distributed Computing #Data Engineering #Data Science #MLflow #Scala #Spark (Apache Spark) #ML (Machine Learning) #Python #Libraries #Databricks #Monitoring #Deep Learning #Documentation #API (Application Programming Interface)
Role description
I'm working with my client on a UK-based, fully remote contract opportunity focused on Platform Excellence within Databricks. This role is all about promoting best-practice use of Databricks across the organisation by building reusable patterns, Python SDKs/libraries, and pragmatic frameworks that teams can adopt quickly and consistently.

The opportunity
You'll sit at the intersection of software engineering, data engineering, and MLOps, helping teams standardise how they build, run, and govern workloads on Databricks - improving developer experience, reliability, scalability, and performance.

Key responsibilities
• Design and build reusable Python packages/SDKs to accelerate delivery across teams
• Create best-practice frameworks and reference implementations (patterns, templates, guardrails)
• Drive improvements across the Databricks estate (e.g., jobs/workflows, cluster optimisation, Unity Catalog, MLflow)
• Partner closely with stakeholders to understand pain points and turn them into practical platform solutions
• Produce clear documentation so engineering and data science teams can self-serve and adopt consistently

Priority skills / experience (what my client is looking for)
• Advanced Python (packaging, modular design, dependency management)
• Proven experience building reusable libraries / internal SDKs
• Strong Databricks experience (jobs/workflows, MLflow, Unity Catalog, optimisation)
• Strong software & platform engineering mindset (API/system design, scalability, performance)
• Strong MLOps knowledge (lifecycle management, reproducibility, monitoring)
• Solid data engineering fundamentals (Spark, distributed computing, pipelines)
• Strong documentation and developer enablement focus

Nice to have
• Machine Learning / Deep Learning / GenAI exposure

What makes someone successful here
• Excellent communication and stakeholder engagement
• Self-starter who works well with minimal direction
• Pragmatic delivery mindset (balancing best practice with business reality)
• Patient, coaching-oriented approach and strong problem-solving skills