Harnham

Data Engineer (Snowflake, dbt)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Snowflake, dbt) on a 6-month contract in London, offering £500 - £550 per day. Key skills include Snowflake, SQL, dbt, and cloud platforms. Experience in data transformation and pipeline optimisation is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
550
🗓️ - Date
May 7, 2026
🕒 - Duration
6 months
🏝️ - Location
On-site
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#Azure #GCP (Google Cloud Platform) #Data Ingestion #Snowflake #AWS (Amazon Web Services) #Scala #SQL (Structured Query Language) #Datasets #Cloud #Data Engineering #Leadership #dbt (data build tool) #Documentation #ETL (Extract, Transform, Load) #Data Architecture #Airflow #Data Warehouse #Data Pipeline
Role description
DATA ENGINEER
6-MONTH CONTRACT
LONDON (1 day in office)
£500 - £550 per day (Outside IR35)

This position as a Snowflake Engineer offers the opportunity to join a leading travel company based near West London, currently undergoing a major cloud data transformation.

THE COMPANY
This established Pharma brand is known for its innovation and its commitment to delivering seamless customer experiences through data. With a focus on modernising its data platforms, the company is investing in Snowflake and cloud-native tooling to better understand customer journeys, improve operations, and fuel growth. Contractors and permanent hires we've placed here consistently highlight the collaborative culture, exciting technical challenges, and strong support from leadership.

THE ROLE
As a Data Engineer, you will play a crucial role in a data transformation programme, focusing on optimising data pipelines and enabling cloud-driven insights. You will also be responsible for post-project documentation, ensuring clear communication with non-technical stakeholders.

Your key responsibilities will include:
• Designing and implementing a Snowflake-based data warehouse, ensuring scalability and efficiency.
• Developing and optimising ELT pipelines, leveraging Snowflake best practices for data ingestion and transformation.
• Enhancing performance and query optimisation, ensuring cost-effective and high-performing workloads.
• Working with dbt and Airflow to transform raw data into structured datasets for analytical consumption.
• Creating clear documentation to communicate technical processes to non-technical teams.

KEY SKILLS AND REQUIREMENTS
To succeed in this role, you should have:
• Strong commercial experience with Snowflake and its ecosystem.
• Proficiency in SQL for data modelling and transformation within Snowflake.
• Experience with dbt to develop scalable data pipelines.
• Knowledge of ELT processes and best practices for cloud-based data architecture.
• Hands-on experience with performance tuning and query optimisation in Snowflake.
• Familiarity with cloud platforms (AWS/GCP/Azure) and their integration with Snowflake.

HOW TO APPLY
Please register your interest by sending your CV via the apply link on this page.