Tenth Revolution Group

Azure Data Engineer - £250PD Outside IR35 - Remote

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer on a contract of unspecified duration, offering £250 per day outside IR35, fully remote. Key skills include Azure Databricks, Azure Synapse Analytics, SQL, and Python; 3+ years of relevant experience is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
£250
-
🗓️ - Date
December 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London
-
🧠 - Skills detailed
#Microsoft Azure #Scala #Terraform #Data Architecture #Spark (Apache Spark) #DevOps #PySpark #ADF (Azure Data Factory) #Agile #Azure #Version Control #Databases #Azure Databricks #Azure SQL #Azure Data Factory #Python #Data Quality #Data Analysis #Databricks #BI (Business Intelligence) #Security #Scrum #Azure DevOps #GIT #Kafka (Apache Kafka) #Microsoft Power BI #Data Pipeline #Compliance #Data Governance #Infrastructure as Code (IaC) #SQL (Structured Query Language) #Data Science #Datasets #Data Modeling #Storage #Azure Synapse Analytics #Azure ADLS (Azure Data Lake Storage) #Azure Data Platforms #Data Lake #AI (Artificial Intelligence) #Spark SQL #Data Warehouse #Data Engineering #ADLS (Azure Data Lake Storage) #ETL (Extract, Transform, Load) #Synapse
Role description
Data Engineer - £250 Outside IR35 - Remote

Job Summary
We are seeking a skilled Azure Data Engineer to design, build, and maintain scalable data solutions on Microsoft Azure. The ideal candidate has strong hands-on experience with Azure Databricks and Azure Synapse Analytics, and is passionate about transforming raw data into reliable, high-quality datasets that support analytics, reporting, and advanced data use cases.

Key Responsibilities
• Design, develop, and optimize end-to-end data pipelines using Azure services
• Build and maintain scalable ETL/ELT workflows using Azure Databricks (PySpark/SQL)
• Develop and manage data warehouses and analytics solutions using Azure Synapse Analytics
• Ingest data from multiple sources (APIs, databases, files, streaming sources) into Azure data platforms
• Implement data modeling, transformation, and validation to ensure data quality and reliability
• Optimize performance, cost, and scalability of data pipelines and queries
• Collaborate with data analysts, data scientists, and business stakeholders to deliver data solutions
• Implement security, governance, and compliance best practices (RBAC, data masking, encryption)
• Monitor, troubleshoot, and resolve pipeline and performance issues
• Document data architecture, pipelines, and operational processes

Required Qualifications
• 3+ years of experience as a Data Engineer or in a similar role
• Strong experience with Azure Databricks (PySpark, Spark SQL)
• Hands-on experience with Azure Synapse Analytics (dedicated and/or serverless pools)
• Solid understanding of data warehousing concepts and dimensional modeling
• Proficiency in SQL and Python
• Experience with Azure data services such as Azure Data Lake Storage (ADLS Gen2), Azure Data Factory, and Azure SQL
• Familiarity with CI/CD pipelines and version control (Git, Azure DevOps)
• Experience working in Agile/Scrum environments

Preferred Qualifications
• Azure certifications (e.g., Azure Data Engineer Associate)
• Experience with streaming technologies (Event Hubs, Kafka, or Spark Structured Streaming)
• Knowledge of data governance tools (Purview, Unity Catalog)
• Experience with Power BI or other BI/analytics tools
• Exposure to DevOps and Infrastructure as Code (ARM, Bicep, or Terraform)

To apply for this role, please submit your CV or contact Dillon Blackburn on or at . Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.