

Senior Data Engineer (Databricks)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Databricks) on a 6+ month contract in the Financial Services sector. Key skills include Databricks, Python, Spark, and Kafka. Experience with CI/CD tools and SQL is required; knowledge of GDPR is a plus.
Country
United Kingdom
Currency
£ GBP
Day rate
600
Date discovered
July 8, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Reading, England, United Kingdom
Skills detailed
#Data Pipeline #Azure Event Hubs #Spark (Apache Spark) #Cloud #Data Engineering #GDPR (General Data Protection Regulation) #Python #GitHub #Kafka (Apache Kafka) #Big Data #Azure #Java #Scala #SQL (Structured Query Language) #Delta Lake #Azure DevOps #DevOps #PySpark #Databricks #ETL (Extract, Transform, Load)
Role description
Contract Data Engineer
Financial Services | 6+ Months | Competitive Day Rate
My client, a global consultancy, is looking for an experienced Data Engineer to join on a contract basis to support major digital transformation projects with Tier 1 banks.
You'll help design and build scalable, cloud-based data solutions using Databricks, Python, Spark, and Kafka, working on both greenfield initiatives and enhancements to high-traffic financial applications.
Key Skills & Experience:
• Strong hands-on experience with Databricks, Delta Lake, Spark Structured Streaming, and Unity Catalog (a minimal pipeline sketch follows this list)
• Advanced Python/PySpark and big data pipeline development
• Familiarity with event streaming tools (Kafka, Azure Event Hubs)
• Solid understanding of SQL, data modelling, and lakehouse architecture
• Experience deploying via CI/CD tools (e.g., Azure DevOps, GitHub Actions)
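To give a concrete feel for the day-to-day work this stack implies, here is a minimal sketch of a Spark Structured Streaming job that reads events from Kafka and appends them to a Delta Lake table. The broker address, topic name, event schema, and storage paths are illustrative placeholders, not details from the client; it assumes a Databricks runtime (or equivalent) where the Kafka and Delta connectors are available.

```python
# Minimal sketch: stream events from Kafka into a Delta table.
# Broker, topic, schema, and paths are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = SparkSession.builder.appName("trades-ingest").getOrCreate()

# Assumed event shape for the sketch.
event_schema = StructType([
    StructField("trade_id", StringType()),
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("executed_at", TimestampType()),
])

# Read the raw Kafka stream (value arrives as bytes).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
       .option("subscribe", "trades")                     # placeholder topic
       .load())

# Parse the JSON payload into typed columns.
parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), event_schema).alias("e"))
          .select("e.*"))

# Append to Delta with a checkpoint so the stream is restartable.
(parsed.writeStream
 .format("delta")
 .option("checkpointLocation", "/mnt/checkpoints/trades")  # placeholder
 .outputMode("append")
 .start("/mnt/delta/trades"))                              # placeholder
```

In practice a table like this would typically be registered in Unity Catalog and queried with SQL downstream; the sketch only covers the ingest step.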
Nice to Have:
• Knowledge of Scala/Java
• Understanding of GDPR and handling sensitive data (see the pseudonymisation sketch below)
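On the GDPR point, one common pattern in regulated pipelines is pseudonymising direct identifiers before data lands in analytics tables. A hedged illustration follows; the column names, sample data, and salt handling are assumptions for the sketch (a real deployment would fetch the salt from a secret store such as Azure Key Vault), not the client's actual controls.

```python
# Sketch: pseudonymise a direct identifier with a salted SHA-256 hash
# before writing to an analytics layer. Column names, sample data, and
# the salt source are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import sha2, concat, col, lit

spark = SparkSession.builder.appName("pii-pseudonymise").getOrCreate()

SALT = "replace-with-secret-from-key-vault"  # assumption: from a secret store

customers = spark.createDataFrame(
    [("alice@example.com", "GB", 42.0)],
    ["email", "country", "balance"],
)

pseudonymised = (customers
    .withColumn("email_hash", sha2(concat(col("email"), lit(SALT)), 256))
    .drop("email"))  # drop the raw identifier; keep only the hash

pseudonymised.show(truncate=False)
```

Hashing with a secret salt supports joins on the pseudonym while keeping the raw identifier out of the analytics layer; erasure requests can then be handled at the source rather than across every downstream table.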
This is a UK-based contract role offering the chance to work on high-impact projects shaping the future of finance, with some onsite presence required.