Hunter Bond

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a 6-month initial contract, paying an inside IR35 rate, requiring 3 days per week in London. Key skills include Python, Dagster, Apache Kafka, and AWS experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Leadership #Datasets #Cloud #Monitoring #Batch #AWS (Amazon Web Services) #Version Control #Code Reviews #Data Integrity #SQL (Structured Query Language) #Scala #Python #Data Integration #ETL (Extract, Transform, Load) #Web Services #Storage #Data Governance #Kafka (Apache Kafka) #Data Engineering #Data Pipeline #Logging #Apache Kafka
Role description
Senior Data Engineer • Inside IR35 Contract • 3 Days per Week in London • 6-Month Initial Contract (2 Years+ Project Scope) • Start Date: April 2026

We are working alongside a leading Commodities Trading Firm who are searching for a Senior Data Engineer to join the business. You'll be building pipelines to support their trading and analytics platforms!

Project & Responsibilities
• Design and develop scalable Python-based data pipelines using Dagster to ingest, transform and integrate datasets into the enterprise data platform.
• Build ingestion frameworks to extract data from internal systems, trading platforms, APIs, and external providers, supporting both batch and real-time data flows.
• Implement streaming data integrations using Apache Kafka to enable event-driven processing and near real-time analytics.
• Land and manage curated datasets within AWS storage environments, ensuring data integrity, scalability and cost efficiency.
• Apply strong data governance practices including validation, lineage tracking, schema consistency and secure access controls across the data platform.
• Ensure production stability and operational reliability of data pipelines through monitoring, alerting, logging and proactive incident management.
• Break down complex data integration initiatives into structured technical deliverables, defining implementation strategies and mitigating delivery risks.
• Provide engineering leadership through code reviews, CI/CD best practices, mentoring engineers and promoting scalable, maintainable data platform solutions.

Technical Skill Set Required
• Strong Python data engineering experience, including orchestration with Dagster, advanced SQL, and data modelling.
• Experience building streaming and event-driven pipelines using Apache Kafka and integrating heterogeneous data sources.
• Solid background operating cloud-native data platforms on Amazon Web Services, with CI/CD, version control and production operations experience.

Contact jevans@hunterbond.com for more details.