Data Engineer (Azure, Databricks & Snowflake)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Azure, Databricks & Snowflake) on a 6–12 month contract, remote (UK/EU based), offering a competitive day rate. Requires strong experience in Databricks, Snowflake, Azure services, SQL, and Python.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
-
🗓️ - Date discovered
September 18, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Fixed Term
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Databricks #Data Quality #Data Engineering #Scala #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Storage #DevOps #Delta Lake #SQL (Structured Query Language) #Python #Data Lake #Azure #Snowflake #ML (Machine Learning) #Security #Synapse #Cloud #Data Architecture #Azure Cloud #Data Pipeline #Azure Databricks
Role description
Data Engineer – Azure / Databricks / Snowflake
📍 Remote (UK/EU based) | ⏳ 6–12 months | 💷 Competitive Day Rate

We are supporting a leading Financial Services client in the early stages of building a new enterprise Data Lake on Azure Cloud. To deliver this programme, we're looking for experienced contract Data Engineers with proven expertise in Databricks and Snowflake, available for immediate engagement.

The role:
• Design, build, and optimise data pipelines and ETL processes on Azure Databricks (an illustrative sketch follows the role description)
• Implement scalable, high-performance data models in Snowflake
• Work closely with architects and stakeholders to define data architecture, storage, and integration patterns
• Contribute to the build of a secure, production-ready Data Lake from the ground up
• Support best practices across data quality, governance, and CI/CD

You will bring:
• Strong commercial experience as a Data Engineer
• End-to-end project exposure, ideally from greenfield/early-stage Data Lake builds
• Deep hands-on knowledge of Databricks (Spark, Delta Lake, ML pipelines) and Snowflake (warehousing, optimisation, security)
• Solid understanding of Azure cloud services (Data Factory, Synapse, Storage, DevOps)
• Strong SQL, Python, and ETL/ELT background

Contract details:
• Start: ASAP
• Duration: 6–12 months
• Location: Remote (must be UK/EU based)
• Day rate: Competitive

If you've successfully delivered Databricks and Snowflake projects on Azure and want to play a key role in building a new data platform from the ground up, we'd love to hear from you.
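For context on the kind of day-to-day work described above, here is a minimal, illustrative PySpark sketch of a Databricks-style ETL step: read raw files from Azure Data Lake Storage, apply basic cleansing, land the result as a Delta Lake table, and push a curated copy to Snowflake via the Spark-Snowflake connector. It is not the client's codebase; all table names, storage paths, and connection options are hypothetical placeholders.

# Illustrative sketch only; names, paths, and credentials are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw landing-zone files in Azure Data Lake Storage (placeholder path)
raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://landing@examplestorage.dfs.core.windows.net/trades/")
)

# Transform: deduplicate, enforce types, and drop unusable rows
clean = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("trade_date", F.to_date("trade_date"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount").isNotNull())
)

# Load: persist as a Delta Lake table for downstream consumers
clean.write.format("delta").mode("overwrite").saveAsTable("curated.trades")

# Optionally publish the curated data to Snowflake with the Spark-Snowflake connector.
# In practice these options would come from a secret scope, not literals.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "TRANSFORM_WH",
}

(
    clean.write
         .format("snowflake")
         .options(**sf_options)
         .option("dbtable", "TRADES")
         .mode("overwrite")
         .save()
)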