Fairfield Consultancy Services Limited (UK)

Senior Machine Learning Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Machine Learning Engineer on a hybrid contract in London, UK, lasting 6 months, with a pay rate of "£X/hour." Requires 10+ years of experience, hands-on AWS skills, and expertise in Kafka, Flink, and PyTorch.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
February 19, 2026
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Scala #ML (Machine Learning) #Kafka (Apache Kafka) #Infrastructure as Code (IaC) #MongoDB #S3 (Amazon Simple Storage Service) #Python #AWS SageMaker #SageMaker #Data Pipeline #AWS S3 (Amazon Simple Storage Service) #Deployment #Data Lake #Batch #Redis #AWS (Amazon Web Services) #PyTorch #Data Access #Monitoring
Role description
We are looking for a Senior ML Engineer to design, build, and operate scalable real-time data pipelines and ML platforms on AWS. This is a hybrid contract role based in London, UK (2 days a week on-site). Experience: 10+ years.
Key Responsibilities
• Build and manage real-time streaming pipelines using Kafka and Flink
• Implement micro-batch processing (5-minute, hourly, daily)
• Design and operate S3-based data pipelines and data lakes
• Set up and manage Redis clusters for low-latency data access
• Evaluate and implement MongoDB/Atlas where required
• Build and operate MLOps pipelines using AWS SageMaker (training, deployment, monitoring)
• Productionize ML models built in PyTorch
• Ensure scalability, reliability, and performance of data and ML systems
Required Skills
• 2-3+ years of hands-on AWS experience
• Kafka and Flink (real-time streaming pipelines)
• AWS S3 data pipeline and data lake design
• Real-time and micro-batch processing
• Redis cluster setup and management
• AWS SageMaker (training, deployment, MLOps)
• PyTorch
• Strong Python skills
Nice to Have
• MongoDB/MongoDB Atlas
• CI/CD and Infrastructure as Code
• Experience with large-scale distributed systems