

Databricks Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Engineer on a 6-month contract, offering £550-£600 per day, based in London. Key skills include Azure Databricks, Python, SQL, and experience with big data tools. Strong cloud security knowledge is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
600
🗓️ - Date discovered
August 9, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Microsoft Azure #SQL (Structured Query Language) #Strategy #Hadoop #Security #Synapse #Kafka (Apache Kafka) #Python #ADF (Azure Data Factory) #Terraform #Automation #Cloud #Forecasting #Data Pipeline #Version Control #Data Lake #Big Data #Azure Data Factory #Data Access #Scala #Spark (Apache Spark) #Azure Databricks #Data Lakehouse #Databricks #Azure #Infrastructure as Code (IaC) #Migration #Deployment
Role description
DATABRICKS ENGINEER
6-MONTH CONTRACT
£550-£600 PER DAY
This role offers a great opportunity for an Azure Databricks Engineer to join a renewable energy firm based in London. You'll play a hands-on role in developing and optimising modern data lakehouse solutions on Azure, while supporting critical analytics and data delivery systems. The environment encourages technical ownership, collaboration, and the chance to tackle complex cloud-native engineering challenges.
THE COMPANY
This is a leading organisation within the renewable energy sector, dedicated to sustainable innovation and data-driven operations. The business is undergoing rapid digital transformation, investing in cloud-based technologies to optimise performance, forecasting, and environmental impact. With operations across multiple regions, their data initiatives play a key role in supporting clean energy production, distribution, and strategy.
THE ROLE
You'll join a collaborative engineering team focused on building scalable, secure, and efficient data platforms on Microsoft Azure. Your work will directly support migration initiatives, analytics enablement, and platform reliability. You'll be responsible for data pipeline development, resource deployment, and ongoing optimisation of cloud-native systems.
Your responsibilities will include:
• Designing and implementing scalable data lakehouse architectures using Databricks on Azure.
• Building efficient ETL/ELT pipelines for structured and unstructured data.
• Working with stakeholders to ensure high-quality, accessible data delivery.
• Optimising SQL workloads and data flows for analytics performance.
• Automating infrastructure deployment using Terraform and maintaining CI/CD practices.
• Supporting secure and performant data access via cloud-based networking.
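As a rough illustration of the lakehouse pipeline work described above, here is a minimal, hypothetical sketch of the bronze/silver/gold ("medallion") pattern using plain Python structures so it runs anywhere; on Databricks the same steps would typically be PySpark transformations over Delta tables. The site names and readings are invented for the example.

```python
# Bronze layer: raw ingested records, possibly duplicated or malformed.
bronze = [
    {"site": "wind-01", "output_mw": "12.5", "ts": "2025-08-01T00:00"},
    {"site": "wind-01", "output_mw": "12.5", "ts": "2025-08-01T00:00"},  # duplicate
    {"site": "wind-02", "output_mw": None,   "ts": "2025-08-01T00:00"},  # bad value
    {"site": "wind-02", "output_mw": "8.0",  "ts": "2025-08-01T01:00"},
]

# Silver layer: cleaned, deduplicated, with types enforced.
seen = set()
silver = []
for rec in bronze:
    key = (rec["site"], rec["ts"])
    if rec["output_mw"] is None or key in seen:
        continue  # drop bad values and duplicate (site, timestamp) pairs
    seen.add(key)
    silver.append({**rec, "output_mw": float(rec["output_mw"])})

# Gold layer: aggregated, analytics-ready view (total output per site).
gold = {}
for rec in silver:
    gold[rec["site"]] = gold.get(rec["site"], 0.0) + rec["output_mw"]

print(gold)
```

The same shape carries over to Spark: bronze is the raw ingest table, silver applies schema enforcement and deduplication, and gold serves the analytics and forecasting workloads mentioned above.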
KEY SKILLS AND REQUIREMENTS
• Strong experience with Azure Databricks in production environments.
• Background with Azure Data Factory, Azure Functions, and Synapse Analytics.
• Proficient in Python and advanced SQL, including query tuning and optimisation.
• Hands-on experience with big data tools such as Spark, Hadoop, and Kafka.
• Familiarity with CI/CD pipelines, version control, and deployment automation.
• Experience using Infrastructure as Code tools like Terraform.
• Solid understanding of Azure-based networking and cloud security principles.
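On the query-tuning requirement, the following is a small, self-contained illustration using the standard-library sqlite3 module (not Databricks SQL, where tuning relies more on partitioning and file layout, but the underlying principle is the same): a selective filter goes from a full table scan to an index search once an index exists. Table and column names are invented.

```python
import sqlite3

# In-memory database with a synthetic readings table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (site_id INTEGER, output_mw REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(i % 50, float(i)) for i in range(1000)],
)

query = "SELECT SUM(output_mw) FROM readings WHERE site_id = 7"

# Without an index, the planner must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

# An index on the filtered column changes the plan to an index search.
conn.execute("CREATE INDEX idx_site ON readings (site_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

print(plan_before)  # e.g. "SCAN readings"
print(plan_after)   # e.g. "SEARCH readings USING INDEX idx_site (site_id=?)"
```

Reading the plan before and after a change, rather than guessing, is the habit this role's "query tuning and optimisation" requirement points at, whatever the engine.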
HOW TO APPLY
Please register your interest by sending your CV via the apply link on this page.