Senior Data Engineer (contract)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (contract) with a duration of more than 6 months, working on-site. Key skills include Python, Azure Data Factory, and Docker, along with experience in hybrid cloud data architecture. Active Security Clearance is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
700
-
πŸ—“οΈ - Date discovered
September 24, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Yes
-
πŸ“ - Location detailed
Ledbury, England, United Kingdom
-
🧠 - Skills detailed
#Data Security #Data Storage #ETL (Extract, Transform, Load) #Elasticsearch #PostgreSQL #DevOps #Cloud #Scripting #Azure cloud #Docker #ADF (Azure Data Factory) #Data Pipeline #Kafka (Apache Kafka) #Python #Compliance #Azure Data Factory #Data Engineering #Azure #Data Architecture #Batch #Data Integration #Automation #Security #Databases #Scala #Data Processing #GitHub #Kubernetes #Deployment #Storage #Data Integrity
Role description
Senior Data Engineer
On-site | Full time
Methods Business and Digital Technology Limited

Methods is a £100M+ IT Services Consultancy that has partnered with a range of central government departments and agencies to transform the way the public sector operates in the UK. Established over 30 years ago and UK-based, we apply our skills in transformation, delivery, and collaboration from across the Methods Group to create end-to-end business and technical solutions that are people-centred, safe, and designed for the future. Our human touch sets us apart from other consultancies, system integrators and software houses: with people, technology, and data at the heart of who we are, we believe in creating value and sustainability through everything we do for our clients, staff, communities, and the planet. We support our clients in the success of their projects while working collaboratively to share skill sets and solve problems. At Methods we have fun while working hard; we are not afraid of making mistakes and learning from them. Predominantly focused on the public sector, Methods is now building a significant private sector client portfolio. Methods was acquired by the Alten Group in early 2022.

Requirements

On-site, full time. This role requires ACTIVE Security Clearance, with a willingness to move to DV.

We are seeking a seasoned Senior Data Engineer to join our team. This role is essential for designing, building, and maintaining sophisticated data infrastructure systems that operate across both on-premises and Azure cloud environments. The position involves deploying and managing scalable data operations that support advanced analytics and data-driven decision-making, crucial for our organisational growth and innovation.

Requirements

• Develop and Manage Data Pipelines: You will design, construct, and maintain efficient and reliable data pipelines using Python, Go, or Azure Data Factory, capable of supporting both streaming and batch data processing across structured, semi-structured, and unstructured data in on-premises and Azure environments.
• Hybrid Cloud and Data Storage Solutions: Implement and manage data storage solutions leveraging both on-premises infrastructure and Azure, ensuring seamless data integration and accessibility across platforms.
• Containerisation and Orchestration: Utilise Docker for containerisation and Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments.
• Workflow Automation: Employ tools such as Azure Data Factory to automate data flows and manage complex workflows within hybrid environments.
• Event Streaming Experience: Utilise event-driven technologies such as Kafka and NATS to handle real-time data streams effectively.
• Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms.
• Database Development: Design and develop PostgreSQL databases, ensuring high performance and availability across diverse deployment scenarios.

Essential Skills And Experience

• Strong Python Skills: Expertise in Python for scripting and automating data processes across varied environments.
• Experience with ETL/ELT: Demonstrable experience in developing and optimising ETL or ELT workflows, particularly in hybrid (on-premises and Azure) environments (a minimal illustrative sketch follows at the end of this listing).
• Expertise in Hybrid Cloud Data Architecture: Knowledge of integrating on-premises infrastructure with Azure cloud services.
• Containerisation and Orchestration Expertise: Solid experience with Docker, GitHub, and Kubernetes for managing applications across both on-premises and cloud platforms.
• Proficiency in Workflow Automation Tools: Practical experience with Azure Data Factory in hybrid environments.
• Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms such as Kafka and NATS.
• Data Security Knowledge: Experience implementing security practices and tools, including Keycloak, across multiple platforms.
• Search and Database Development Skills: Strong background in managing Elasticsearch and PostgreSQL in environments that span on-premises and cloud infrastructures.

Your Impact

In this role, you will empower business leaders to make informed decisions by delivering timely, accurate, and actionable data insights from a robust, hybrid infrastructure. Your expertise will drive the seamless integration of on-premises and cloud-based data solutions, enhancing both the flexibility and scalability of our data operations. You will champion the adoption of modern data architectures and tooling, and play a pivotal role in cultivating a data-driven culture within the organisation, mentoring team members, and advancing our engineering practices.

Desirable Skills And Experience

• Certifications in Azure and Other Relevant Technologies: Certifications in cloud and on-premises technologies are highly beneficial and will strengthen your application.
• Experience in Data Engineering: A minimum of 5 years of experience in data engineering, with significant exposure to managing infrastructure in both on-premises and cloud settings.
• Some DevOps Engineering experience would be preferable.