

Data Engineer - ETL & Streaming
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer - ETL & Streaming on a 6-month contract, fully remote, paying up to £380/day. Key skills include Python, Scala, Apache Spark, and OpenTelemetry. Experience with real-time streaming and machine learning is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
380
🗓️ - Date discovered
August 14, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Big Data #Scala #Anomaly Detection #Data Engineering #Data Pipeline #ETL (Extract, Transform, Load) #Data Modeling #DevOps #Java #Python #Spark (Apache Spark) #Data Framework #Kafka (Apache Kafka) #Data Manipulation #ML (Machine Learning) #Django #Observability #Data Processing #Apache Spark #Programming #Jupyter #Flask #Datasets #Databases #Data Science
Role description
Data Engineer – ETL & Streaming – Remote – 6-Month Contract
Overview:
My client is seeking a highly skilled Data Engineer with strong expertise in ETL pipelines, real-time streaming, and telemetry data processing. This role involves building and optimising data pipelines, instrumenting systems for observability, and applying machine learning to streaming telemetry for actionable insights.
Core Responsibilities:
Programming & Statistical Computing
• Expert in Python for data manipulation, statistical analysis, and ML model development.
• Experience with Scala or Java (preferred for big data frameworks such as Apache Spark, Kafka, and backend systems).
OpenTelemetry & Observability
• In-depth understanding of OpenTelemetry architecture and OTLP (OpenTelemetry Protocol).
• Experience instrumenting and collecting telemetry from:
  • Virtual Machines
  • Middleware (e.g., IIS, Apache)
  • Hypervisors
  • Databases
• Skilled in structuring and normalising telemetry data for downstream analysis (an illustrative sketch follows).
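To give a feel for this kind of instrumentation work, here is a minimal sketch that emits a span over OTLP using the OpenTelemetry Python SDK. The collector endpoint, service name, and attributes are assumptions for illustration, not details specified by the client.

```python
# Minimal OpenTelemetry tracing sketch (requires the opentelemetry-sdk and
# opentelemetry-exporter-otlp packages); endpoint and names are illustrative.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Describe the emitting service so telemetry can be correlated downstream.
provider = TracerProvider(
    resource=Resource.create({"service.name": "vm-metrics-agent"})  # assumed name
)
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("telemetry.example")

# Wrap a unit of work in a span and attach attributes used for later analysis.
with tracer.start_as_current_span("collect-hypervisor-stats") as span:
    span.set_attribute("host.name", "hv-01")
    span.set_attribute("telemetry.source", "hypervisor")
```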
Streaming Data & ML Integration
• Proven experience with OTLP and real-time streaming data pipelines.
• Ability to correlate telemetry across multiple layers (infrastructure, applications, databases).
• Application of ML models to streaming data for actionable insights.
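As a rough sketch of the streaming side of the role, the PySpark job below reads JSON telemetry from Kafka and aggregates it into per-host windows that downstream ML models could consume. The broker address, topic, and message schema are assumptions, and the job needs the spark-sql-kafka connector on the classpath.

```python
# Illustrative PySpark Structured Streaming job: parse JSON telemetry from Kafka
# and compute per-host rolling averages. Broker, topic, and schema are assumed.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("telemetry-streaming").getOrCreate()

schema = StructType([
    StructField("host", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("ts", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
       .option("subscribe", "telemetry")                   # assumed topic
       .load())

events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("e"))
          .select("e.*"))

# One-minute tumbling windows per host/metric; late events tolerated for 5 minutes.
rolling = (events.withWatermark("ts", "5 minutes")
           .groupBy(F.window("ts", "1 minute"), "host", "metric")
           .agg(F.avg("value").alias("avg_value")))

query = rolling.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```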
Telemetry Data Modeling
• Design and implement structured telemetry schemas.
• Strong grasp of distributed tracing, metrics, and log correlation.
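One way to read the schema-design requirement is a single normalised record type that carries the identifiers needed to correlate metrics, logs, and traces. The field names below are illustrative assumptions, not a schema prescribed by the client.

```python
# Illustrative normalised telemetry record; field names are assumptions chosen
# to make metric/log/trace correlation explicit.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class TelemetryRecord:
    timestamp: datetime
    signal: str                       # "metric" | "log" | "trace"
    name: str                         # e.g. "cpu.utilisation" or a span name
    resource: dict = field(default_factory=dict)   # e.g. {"host.name": "vm-01"}
    value: Optional[float] = None     # populated for metrics
    body: Optional[str] = None        # populated for logs
    trace_id: Optional[str] = None    # shared ID lets logs/metrics join to traces
    span_id: Optional[str] = None
    attributes: dict = field(default_factory=dict)
```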
ML-Driven Observability
• Apply ML techniques (e.g., anomaly detection, predictive analytics) to observability datasets.
• Develop feedback loops to trigger automated responses based on telemetry insights.
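As a sketch of what ML-driven observability can look like in practice, the example below fits an unsupervised model on baseline telemetry features and flags outlying windows that a feedback loop could act on. The synthetic data and feature choices are illustrative only.

```python
# Illustrative anomaly detection over aggregated telemetry features using
# scikit-learn's IsolationForest; synthetic data stands in for real metrics.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Assumed features per one-minute window: cpu %, p95 latency (ms), error rate (%).
baseline = rng.normal(loc=[40.0, 120.0, 0.5], scale=[5.0, 15.0, 0.2], size=(500, 3))
incident = rng.normal(loc=[95.0, 900.0, 8.0], scale=[2.0, 50.0, 1.0], size=(5, 3))
windows = np.vstack([baseline, incident])

model = IsolationForest(contamination=0.01, random_state=0).fit(baseline)
flags = model.predict(windows)  # -1 = anomalous window, 1 = normal

for idx in np.where(flags == -1)[0]:
    cpu, latency, errors = windows[idx]
    # In a real feedback loop this is where an alert or automated action would fire.
    print(f"window {idx}: cpu={cpu:.0f}% latency={latency:.0f}ms errors={errors:.1f}%")
```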
Collaboration & Communication
• Effectively communicate technical telemetry and ML insights to cross-functional teams.
• Partner with DevOps, SRE, and platform teams to enhance observability maturity.
Mandatory Skills
• Flask
• Apache Spark
• Python
• Nginx
• Django
• Jupyter Notebook
• Scala
• Apache Spark (Scala)
• Python – Data Science
Fully Remote | Up to £380/day (Outside IR35, Umbrella)
If this sounds like you, please apply directly.