

OKTO Technologies
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Lisburn, on a 3-month contract at £500.00 per day. Key skills include Azure mastery, Python, and experience with high-frequency data feeds. A minimum of 6 years in data engineering is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
500
-
🗓️ - Date
February 17, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lisburn
-
🧠 - Skills detailed
#Grafana #Spark (Apache Spark) #Azure DevOps #ML (Machine Learning) #Databricks #Data Lake #Scala #Observability #Schema Design #Airflow #BI (Business Intelligence) #Time Series #Data Architecture #Data Engineering #Synapse #DevOps #DAX #Azure Data Factory #Quality Assurance #Security #IoT (Internet of Things) #ADF (Azure Data Factory) #Snowflake #PySpark #Semantic Models #Data Lifecycle #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Kafka (Apache Kafka) #Microsoft Power BI #Spark SQL #Data Quality #Azure #Compliance #Azure Databricks #GitHub #Monitoring #Microsoft Azure #Cloud #SQL (Structured Query Language) #Python
Role description
Job Description:
Location: Lisburn
Contract Length: 3 Months (2-3 days per week)
Rate: £500.00 per day
Role:
OKTO is seeking a Senior Data Engineer to lead the architecture and implementation of a mission-critical digital operations system for several major projects. This role is foundational to our mission – connecting diverse system endpoints such as power, airflow, heating, medical gas and fire alarms into a centralised, resilient cloud-based infrastructure.
You will be responsible for designing the end-to-end data lifecycle, from high-frequency time-series ingestion to the creation of a “Master Dashboard” that ensures data from critical systems remains accurate and trustworthy.
Responsibilities:
Data architecture & modelling: Design and implement a scalable architecture on Microsoft Azure to organise raw system data into curated, reporting-ready layers.
Operational system integration: Build robust pipelines to ingest and process high-frequency time-series and IoT-like data from critical infrastructure.
Cloud database production: Develop and manage the foundational cloud database using Azure Databricks, Synapse and Data Lake to serve as the single source of truth.
Real-time processing: Implement and optimise streaming workloads using Kafka, Event Hub and Spark Streaming to ensure low-latency data availability for life-critical monitoring.
Governance & compliance: Enforce strict data quality frameworks and maintain PII/PHI compliance, ensuring the integrity of sensitive healthcare and operational data.
Visualisation & reporting: Collaborate with stakeholders to design and deploy the “Master Dashboard” using Power BI, utilising DAX and semantic models for actionable insights.
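The first two responsibilities above – ingesting high-frequency time-series readings and organising them into curated, reporting-ready layers – can be illustrated with a minimal, library-free sketch. In practice this would run as a PySpark or Structured Streaming job; the sensor name and one-minute aggregation window below are assumptions for illustration, not from the posting:

```python
from collections import defaultdict
from datetime import datetime

def curate_readings(raw):
    """Aggregate raw high-frequency readings into a reporting-ready
    layer: one row per (sensor, minute) with min/max/mean values.
    A plain-Python sketch of the curated-layer idea."""
    buckets = defaultdict(list)
    for ts, sensor, value in raw:
        # Truncate the timestamp to its minute to form the bucket key.
        minute = ts.replace(second=0, microsecond=0)
        buckets[(sensor, minute)].append(value)
    return {
        key: {
            "min": min(vals),
            "max": max(vals),
            "mean": sum(vals) / len(vals),
        }
        for key, vals in buckets.items()
    }

# Hypothetical raw feed: (timestamp, sensor, value) tuples.
raw = [
    (datetime(2026, 2, 17, 9, 0, 1), "airflow", 12.0),
    (datetime(2026, 2, 17, 9, 0, 30), "airflow", 14.0),
    (datetime(2026, 2, 17, 9, 1, 5), "airflow", 13.0),
]
curated = curate_readings(raw)  # two one-minute buckets for "airflow"
```

The same shape of aggregation – window, group, summarise – is what a production pipeline would express in Spark SQL over the raw (bronze) layer to populate the reporting (gold) layer.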
Requirements:
Azure mastery: Deep expertise in the Azure Ecosystem, specifically Databricks, Azure Data Factory (ADF), Synapse and Unity Catalog.
Data engineering: Proficiency in Python, PySpark and Spark SQL for ETL/ELT pipeline development.
Time-series expertise: Proven experience handling real-time, high-frequency data feeds (e.g. aviation, sensor or industrial data).
Schema design: Advanced knowledge of Star and Snowflake Schemas and Lakehouse architectures.
Quality assurance: Hands-on experience with data validation tools like Great Expectations and observability platforms like Grafana.
DevOps: Familiarity with CI/CD processes using Azure DevOps or GitHub Actions.
Preferred Qualifications:
6+ years of experience in data engineering with a focus on resilient, high availability systems.
Strong understanding of security protocols, including RBAC and row-level masking.
Ability to work closely with AI/ML teams to support predictive maintenance or digital data products.
Job Type: Contract
Contract length: 3 months
Pay: £500.00 per day
Experience:
Data Engineering: 6 years (required)
Work authorisation:
United Kingdom (required)
Location:
Lisburn (preferred)
Work Location: On the road





