

Advanced Resource Managers UK
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer based in Bristol on a 12-month contract, paying up to £79 per hour (Outside IR35). It requires experience in government or regulated industries, expertise in the Elastic Stack and Apache NiFi, and familiarity with ETL tools and containerization.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
632
-
🗓️ - Date
December 3, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Greater Bristol Area, United Kingdom
-
🧠 - Skills detailed
#Data Engineering #Docker #Elastic Stack #ETL (Extract, Transform, Load) #Ansible #Kafka (Apache Kafka) #Apache Kafka #Data Integrity #Data Quality #Data Pipeline #Prometheus #Data Architecture #NiFi (Apache NiFi) #Hadoop #Logstash #Kubernetes #ML (Machine Learning) #Spark (Apache Spark) #Scala #Infrastructure as Code (IaC) #Security #Debugging #Terraform #Compliance #Apache NiFi #Visualization #Disaster Recovery #Grafana #Data Processing #Monitoring #Data Governance #Elasticsearch #Data Ingestion #Replication
Role description
Senior Data Engineer
Bristol
12-Month Contract
Paying up to £79 per hour (Outside IR35)
Role Overview: Our client, a large aerospace company, is looking for an experienced Senior Data Engineer to help build and manage data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
Key Responsibilities:
• Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
• Implement data ingestion, transformation, and integration processes, ensuring data quality and security (an illustrative ingestion sketch follows this list).
• Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards.
• Manage and monitor large-scale data flows in real-time, ensuring system performance, reliability, and data integrity.
• Develop robust data models to support analytics and reporting within secure environments.
• Perform troubleshooting, debugging, and performance tuning of data pipelines and the Elastic Stack.
• Build dashboards and visualizations in Kibana to enable data-driven decision-making.
• Ensure high availability and disaster recovery for data systems, implementing appropriate backup and replication strategies.
• Document data architecture, workflows, and security protocols to ensure smooth operational handover and audit readiness.
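To make the ingestion and data-quality responsibilities concrete, here is a minimal sketch, assuming the elasticsearch-py 8.x client; the endpoint, credentials, index name, and required fields are placeholders for illustration, not details taken from the posting.

```python
from elasticsearch import Elasticsearch, helpers

# Hypothetical connection details; a real deployment would use managed
# credentials, proper TLS, and an environment-appropriate endpoint.
es = Elasticsearch(
    "https://localhost:9200",
    basic_auth=("elastic", "changeme"),
    verify_certs=False,
)

# Placeholder schema: fields a record must carry to be considered usable.
REQUIRED_FIELDS = {"timestamp", "source", "payload"}


def is_valid(record: dict) -> bool:
    """Basic data-quality gate: reject records missing required fields."""
    return REQUIRED_FIELDS.issubset(record)


def ingest(records, index="telemetry-readings"):
    """Bulk-index only the records that pass the quality gate."""
    actions = (
        {"_index": index, "_source": r} for r in records if is_valid(r)
    )
    ok, errors = helpers.bulk(es, actions, raise_on_error=False)
    return ok, errors


if __name__ == "__main__":
    sample = [
        {"timestamp": "2025-12-03T10:00:00Z", "source": "site-a", "payload": {"temp_c": 21.4}},
        {"source": "site-a"},  # missing fields, filtered out before indexing
    ]
    indexed, errors = ingest(sample)
    print(f"indexed={indexed}, indexing_errors={len(errors)}")
```

The quality gate filters malformed records before they reach the index, which reflects the posting's emphasis on data quality and integrity in a regulated environment.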
Required Skillset:
• Experience working in government, defence, or highly regulated industries with knowledge of relevant standards.
• Experience with additional data processing and ETL tools such as Apache Kafka, Spark, or Hadoop (a consumer sketch follows this list).
• Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
• Experience with monitoring and alerting tools such as Prometheus, Grafana, or ELK for data infrastructure.
• Understanding of ML algorithms and their development and implementation.
• Confidence in developing end-to-end solutions.
• Experience with infrastructure as code (e.g. Terraform, Ansible).
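For the Kafka skill above, a minimal consumer sketch, assuming the kafka-python client; the broker address, topic, consumer group, and field names are placeholders for illustration only.

```python
import json
from kafka import KafkaConsumer

# Hypothetical broker, topic, and consumer group, for illustration only.
consumer = KafkaConsumer(
    "telemetry-events",
    bootstrap_servers="localhost:9092",
    group_id="data-eng-sketch",
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    record = message.value
    # Light-touch enrichment before handing records to a downstream step,
    # e.g. the Elasticsearch ingestion sketched after the responsibilities list.
    record["source_topic"] = message.topic
    print(record)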