

GIOS Technology
Data Engineer – Observability & Telemetry
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer – Observability & Telemetry, offering a hybrid contract in Sheffield for 6 months at a competitive pay rate. Key skills include Kafka, OpenTelemetry, Python, and data pipeline experience, particularly in banking.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
328
-
🗓️ - Date
February 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Sheffield, England, United Kingdom
-
🧠 - Skills detailed
#Data Pipeline #Compliance #Data Engineering #Kafka (Apache Kafka) #Observability #Python #REST (Representational State Transfer) #Kubernetes #CLI (Command-Line Interface) #ETL (Extract, Transform, Load) #Security #Prometheus #JSON (JavaScript Object Notation) #Splunk
Role description
I am hiring for a Data Engineer – Observability & Telemetry
Location: Sheffield - Hybrid / 3 days per week in office
• Hands-on experience building streaming data pipelines with Kafka, including producers, consumers, schema registry, and tools such as Kafka Connect, KSQL, or Kafka Streams (see the producer sketch after the key-skills line).
• Proficiency with OpenShift and Kubernetes telemetry, including OpenTelemetry, Prometheus, and related CLI tooling (see the metrics sketch below).
• Experience integrating telemetry into Splunk using HEC, Universal Forwarders, source types, CIM, dashboards, and alerting (see the HEC sketch below).
• Solid data engineering experience in Python (or similar languages) for ETL/ELT, enrichment, and validation.
• Understanding of event schemas (Avro, Protobuf, JSON), data contracts, and backward/forward compatibility (see the schema-evolution sketch below).
• Familiarity with observability frameworks and the ability to drive maturity toward proactive, automated insights.
• Knowledge of security and compliance best practices for data pipelines, including secret management, RBAC, and encryption in transit and at rest.
Key Skills: Kafka / OpenTelemetry / OpenShift / Observability / Python / ETL / ELT / Banking
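
A few hedged sketches follow to illustrate the skills above. First, the Kafka producer side of a streaming pipeline: the sketch below publishes JSON-encoded telemetry events with a delivery callback. It assumes the confluent-kafka Python client and a broker at localhost:9092; the topic and field names are placeholders, and a production pipeline would typically use a schema-registry-backed serializer instead of raw JSON.

```python
import json
from confluent_kafka import Producer

# Assumed local broker; in practice this points at the cluster's bootstrap servers.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Called once per message when the broker acknowledges (or rejects) it.
    if err is not None:
        print(f"Delivery failed for key {msg.key()}: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] @ offset {msg.offset()}")

def publish_event(topic: str, key: str, event: dict) -> None:
    # JSON for brevity; an Avro/Protobuf serializer would normally enforce the data contract.
    producer.produce(
        topic,
        key=key.encode("utf-8"),
        value=json.dumps(event).encode("utf-8"),
        callback=on_delivery,
    )
    producer.poll(0)  # serve delivery callbacks without blocking

if __name__ == "__main__":
    publish_event("telemetry.events", "host-01", {"metric": "cpu_load", "value": 0.42})
    producer.flush()  # block until all outstanding messages are delivered
```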
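For the OpenTelemetry and Prometheus bullet, a minimal sketch of exposing a pipeline counter as a Prometheus scrape target is shown below. It assumes the opentelemetry-sdk, opentelemetry-exporter-prometheus, and prometheus-client packages; the meter and metric names are illustrative, and on OpenShift the /metrics endpoint would typically be picked up via a ServiceMonitor.

```python
from prometheus_client import start_http_server
from opentelemetry import metrics
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.exporter.prometheus import PrometheusMetricReader

# Expose a /metrics endpoint for Prometheus to scrape.
start_http_server(port=8000)

# Route OpenTelemetry metrics through the Prometheus reader.
reader = PrometheusMetricReader()
metrics.set_meter_provider(MeterProvider(metric_readers=[reader]))

meter = metrics.get_meter("telemetry.pipeline")
events_counter = meter.create_counter(
    "pipeline_events_processed",
    unit="1",
    description="Events successfully processed by the streaming pipeline",
)

def record_processed(topic: str, count: int = 1) -> None:
    # Attributes become Prometheus labels on the exported series.
    events_counter.add(count, {"topic": topic})
```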
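The Splunk integration bullet can be illustrated with a plain HTTP Event Collector (HEC) post. The sketch assumes a HEC endpoint on port 8088 and an environment-supplied token; the hostname, index, and sourcetype values are placeholders, and a real pipeline would batch events and handle retries rather than posting one event at a time.

```python
import os
import time
import requests

HEC_URL = "https://splunk.example.internal:8088/services/collector/event"  # assumed endpoint
HEC_TOKEN = os.environ["SPLUNK_HEC_TOKEN"]  # never hard-code the token

def send_to_splunk(event: dict,
                   sourcetype: str = "telemetry:pipeline",
                   index: str = "observability") -> None:
    payload = {
        "time": time.time(),
        "sourcetype": sourcetype,  # map onto a CIM-compliant source type where possible
        "index": index,
        "event": event,
    }
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        json=payload,
        timeout=5,
    )
    resp.raise_for_status()

send_to_splunk({"metric": "consumer_lag", "topic": "telemetry.events", "value": 1200})
```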
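Finally, the data-contract bullet is easiest to see with a schema-evolution example. The sketch below (assuming the fastavro package) writes a record with a v1 schema and reads it back with a v2 schema that adds a defaulted field, which is the backward-compatible evolution pattern the role description refers to.

```python
import io
from fastavro import parse_schema, schemaless_writer, schemaless_reader

# v1: the schema the producer originally wrote with.
schema_v1 = parse_schema({
    "type": "record",
    "name": "TelemetryEvent",
    "fields": [
        {"name": "host", "type": "string"},
        {"name": "value", "type": "double"},
    ],
})

# v2: adds a field with a default, so records written with v1 can still be read.
schema_v2 = parse_schema({
    "type": "record",
    "name": "TelemetryEvent",
    "fields": [
        {"name": "host", "type": "string"},
        {"name": "value", "type": "double"},
        {"name": "region", "type": "string", "default": "unknown"},
    ],
})

buf = io.BytesIO()
schemaless_writer(buf, schema_v1, {"host": "host-01", "value": 0.42})
buf.seek(0)

# Reading an old record with the new schema: the missing field picks up its default.
record = schemaless_reader(buf, schema_v1, reader_schema=schema_v2)
print(record)  # {'host': 'host-01', 'value': 0.42, 'region': 'unknown'}
```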






