

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer specializing in Azure Databricks, offering a 6-month contract in London (hybrid). Requires 12+ years in data pipelines, 4+ years with Azure services, SQL, Python, and experience in data security and architecture.
Country
United Kingdom
Currency
£ GBP
Day rate
-
Date discovered
August 27, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Fixed Term
Security clearance
Unknown
Location detailed
London Area, United Kingdom
Skills detailed
#Databases #Big Data #Data Management #Data Security #Informatica #Cloud #Azure DevOps #Security #Databricks #IICS (Informatica Intelligent Cloud Services) #Storage #ADF (Azure Data Factory) #Data Architecture #SQL Server #Azure Data Factory #Azure Databricks #Agile #DevOps #SQL (Structured Query Language) #Microsoft Azure #Data Quality #Data Engineering #Jira #Azure SQL Database #Synapse #Data Modeling #Oracle #Data Integration #Azure #Python #Delta Lake #Azure Synapse Analytics #Scrum #Data Lake #Azure ADLS (Azure Data Lake Storage) #ADLS (Azure Data Lake Storage) #Data Ingestion #Azure SQL #MDM (Master Data Management)
Role description
Azure Databricks Engineer
London (hybrid working, 3 days onsite)
6-month initial contract
We're looking for an experienced Azure Databricks Engineer to design and build cutting-edge data solutions on cloud-native platforms. If you have a passion for modern data architecture and a track record of success, we want to hear from you.
What You'll Bring:
• 12+ years of experience developing data ingestion, processing, and analytical pipelines for big data and relational databases.
• 4+ years of hands-on experience with core Microsoft Azure services, including:
  • Azure Data Lake Storage (ADLS)
  • Azure Databricks
  • Azure Data Factory
  • Azure Synapse Analytics
  • Azure SQL Database
• Extensive experience with SQL, Python, and data integration patterns using tools like Informatica IICS and Databricks notebooks.
• Strong knowledge of Delta Lake, data warehousing technologies, and cloud platforms, with Azure preferred.
• Experience with on-premises and cloud databases (e.g., Oracle, SQL Server).
• Familiarity with Agile methodologies (Scrum, SAFe) and tools (Jira, Azure DevOps).
• Experience with large-scale data ingestion, cloud process flows, data quality, and master data management (an illustrative sketch of such an ingestion step follows this list).
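For context, here is a minimal, illustrative sketch of the kind of Databricks ingestion step referenced above: reading raw files from ADLS and appending them to a Delta table from a notebook. The storage path, container, and table name are hypothetical placeholders, not details taken from this listing.

    # Minimal sketch (hypothetical paths and names): land raw CSV files from
    # ADLS as an append-only Delta table, run from a Databricks notebook.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

    raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"  # hypothetical ADLS container
    bronze_table = "bronze.sales_raw"                                    # hypothetical target Delta table

    df = (
        spark.read.option("header", "true").csv(raw_path)
        .withColumn("_ingested_at", F.current_timestamp())  # simple audit/lineage column
    )

    df.write.format("delta").mode("append").saveAsTable(bronze_table)

In practice a step like this would typically sit inside an orchestrated Azure Data Factory or Databricks workflow, with schema and data quality checks layered on top.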
Our Ideal Candidate:
• A deep understanding of data security challenges and solutions, particularly with Databricks.
• In-depth knowledge of data delivery, architectural principles, data modeling concepts, and the entire data production process.
• Excellent verbal and written communication skills, with a collaborative, teamwork-oriented mindset.