Tenth Revolution Group

Databricks Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Data Engineer on a 3-month rolling contract, offering £400-£450 per day, fully remote. Key skills include Databricks, the Azure Data Platform, Python, and SQL. Experience in data architecture and ETL processes is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
£400-£450
-
🗓️ - Date
December 4, 2025
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London
-
🧠 - Skills detailed
#Storage #Synapse #ML (Machine Learning) #Python #Data Pipeline #ETL (Extract, Transform, Load) #Data Lake #Spark (Apache Spark) #SQL (Structured Query Language) #Data Ingestion #Azure #Delta Lake #Data Engineering #Databricks #Data Architecture #Data Processing
Role description
Data Engineer (Databricks & Azure) - 3-Month Rolling Contract

Rate: £400-£450 per day
Location: Remote
IR35 Status: Outside IR35
Duration: Initial 3 months (rolling)

About the Company
Join a leading Databricks Partner delivering innovative data solutions for enterprise clients. You'll work on cutting-edge projects leveraging Databricks and Azure to transform data into actionable insights.

About the Role
We are seeking an experienced Data Engineer with strong expertise in Databricks and Azure to join our team on a 3-month rolling contract. This is a fully remote position, offering flexibility and autonomy while working on high-impact data engineering initiatives.

Key Responsibilities
• Design, develop, and optimize data pipelines using Databricks and Azure Data Services.
• Implement best practices for data ingestion, transformation, and storage.
• Collaborate with stakeholders to ensure data solutions meet business requirements.
• Monitor and troubleshoot data workflows for performance and reliability.

Essential Skills
• Proven experience with Databricks (including Spark-based data processing).
• Strong knowledge of the Azure Data Platform (Data Lake, Synapse, etc.).
• Proficiency in Python and SQL for data engineering tasks.
• Understanding of data architecture and ETL processes.
• Ability to work independently in a remote environment.

Nice-to-Have
• Experience with CI/CD pipelines for data solutions.
• Familiarity with Delta Lake and ML pipelines.

Start Date: ASAP
Contract Type: Outside IR35

Apply Now: If you're a skilled Data Engineer looking for a flexible, remote opportunity with a Databricks Partner, we'd love to hear from you!