Harrington Starr

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 3-month contract, paying £400-£500 per day outside IR35, based in London (hybrid). Requires strong Snowflake expertise, advanced Python skills, and experience with ETL/ELT pipelines and data platforms.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
500
🗓️ - Date
January 31, 2026
🕒 - Duration
3 to 6 months
🏝️ - Location
Hybrid
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Compliance #Data Pipeline #Data Architecture #dbt (data build tool) #ETL (Extract, Transform, Load) #Python #Security #Data Lakehouse #Data Quality #BI (Business Intelligence) #Data Engineering #Snowflake #Airflow #Scala #Data Lake
Role description
Senior Data Engineer
Contract: 3 months (rolling)
Start: ASAP
Location: London - Hybrid
Rate: £400-£500 per day, Outside IR35

We are seeking an experienced Senior Data Engineer to help design and deliver a modern Snowflake-based data platform. This is a hands-on contract role focused on building core architecture, establishing robust data pipelines, and delivering the first iteration of a scalable Data Lakehouse to support BI and compliance analytics. This opportunity suits engineers who can move quickly, take ownership, and deliver production-ready solutions with minimal ramp-up time.

Key Responsibilities
• Define and implement the core Snowflake data architecture and data model
• Build and maintain ETL/ELT pipelines from internal systems and third-party data sources
• Design and establish CI/CD workflows for data engineering and analytics operations
• Deliver the initial version of a Data Lakehouse to support business intelligence and regulatory/compliance use cases
• Work closely with technical and business stakeholders to ensure scalable, reliable data solutions
• Apply best practices across performance, security, and data quality

Required Skills & Experience
• Strong hands-on experience with Snowflake / Data Warehousing / Lakehouse architectures (please clearly quantify years of experience and/or number of relevant projects)
• Advanced Python development skills for data engineering
• Experience with orchestration and transformation frameworks such as Airflow, dbt, or similar
• Proven ability to design and deliver data platforms end to end
• Comfortable working autonomously in fast-paced environments

Interviews can be scheduled at short notice, and candidates available immediately or on short notice will be prioritised.