

Queen Square Recruitment
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Databricks) on a 6-month extendable contract in London, UK (hybrid, 2-3 days on site), paying £400 per day inside IR35. It requires 6+ years in data engineering, strong Python and SQL skills, and proven Databricks pipeline experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
400
-
🗓️ - Date
February 4, 2026
🕒 - Duration
6 months (extendable)
-
🏝️ - Location
Hybrid (2-3 days on-site)
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Git #DeltaLake #Python #DataQuality #Autoscaling #Databricks #Monitoring #DataEngineering #Datasets #DevOps #SQL #DataOps #Security #Azure #Classification #Observability #Documentation #Scala #Batch #Cloud #DataScience
Role description
Senior Data Engineer (Databricks) – Contract
📍 London, UK (2-3 days on site)
💷 £400 per day (Inside IR35)
🕒 6-month contract (Extendable)
Travel: Occasional travel to Dublin may be required
Our client, a top global organization, is seeking a Senior Data Engineer to design, build, and operate production-grade data products across customer, commercial, financial, sales, and enterprise data domains. This role is strongly focused on Databricks-based engineering, delivering trusted, governed, and scalable datasets that support reporting, analytics, and advanced use cases.
Key Responsibilities
• Design, build, and maintain Databricks pipelines using Delta Lake and Delta Live Tables (DLT); a minimal sketch follows this list
• Implement medallion architectures (Bronze / Silver / Gold)
• Deliver reusable, well-documented, and discoverable data products
• Ensure pipelines meet non-functional requirements (freshness, latency, scalability, reliability, and cost)
• Own Databricks assets including Jobs/Workflows, notebooks, SQL, and Unity Catalog objects
• Apply Git-based DevOps practices (branching, PRs, CI/CD) and Databricks Asset Bundles (DABs)
• Implement monitoring, alerting, incident response, and root cause analysis
• Support production operations with runbooks and operational standards
• Enforce governance and security using Unity Catalog (lineage, classification, ACLs, row/column-level security)
• Define and maintain data quality rules, expectations, and SLOs
• Support investigation and resolution of data anomalies and production issues
• Partner with Product Owners, the Data Engineering Manager, Data Scientists, and business stakeholders to translate business requirements into functional and non-functional data solutions
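To make the Delta Live Tables items above concrete, here is a minimal sketch of a Bronze-to-Silver DLT flow with declarative data quality expectations. It assumes it runs inside a DLT pipeline (where `spark` is provided by the runtime); the table names, source path, and expectation rules are hypothetical placeholders, not details of the client's estate.

```python
import dlt
from pyspark.sql import functions as F

# Hypothetical landing-zone path; not a detail from the role.
RAW_ORDERS_PATH = "/Volumes/main/landing/orders"

@dlt.table(comment="Bronze: raw orders ingested as-is via Auto Loader.")
def orders_bronze():
    # `spark` is injected by the DLT runtime in pipeline notebooks.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load(RAW_ORDERS_PATH)
    )

@dlt.table(comment="Silver: typed, deduplicated orders.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # drop bad rows
@dlt.expect("recent_order", "order_ts >= '2020-01-01'")        # warn only
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .dropDuplicates(["order_id"])
    )
```

A Gold layer follows the same pattern, and the expectation pass/fail counts land in the pipeline event log, which is the usual hook for the monitoring, alerting, and SLO work listed above.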
Essential Skills & Experience
• 6+ years’ experience in data engineering or advanced analytics engineering
• Strong hands-on expertise in Python and SQL
• Proven experience building production pipelines in Databricks
• Solid understanding of data modelling, performance tuning, and cost optimisation
• High attention to detail with strong documentation and process-design skills
Desirable Experience
• Strong Databricks Lakehouse expertise (Delta Lake, DLT, batch & streaming pipelines)
• Lakehouse monitoring, data quality, and observability
• Unity Catalog governance and security in regulated environments (see the sketch after this list)
• Databricks DevOps/DataOps with CI/CD and environment promotion
• Performance and cost optimisation (autoscaling, Photon/serverless, OPTIMIZE/VACUUM)
• Semantic layer or metrics engineering experience
• Cloud-native analytics platforms (Azure preferred)
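To give a feel for the Unity Catalog governance and table-maintenance items above, here is a rough sketch run from a notebook via `spark.sql`, assuming a Unity Catalog-enabled workspace. Every catalog, schema, table, group, and function name below is invented for illustration.

```python
# All object and group names here are hypothetical placeholders.

# Column-level security: mask emails for non-privileged readers.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.governance.mask_email(email STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('pii_readers')
                THEN email ELSE '***REDACTED***' END
""")
spark.sql("""
    ALTER TABLE main.silver.customers
    ALTER COLUMN email SET MASK main.governance.mask_email
""")

# Row-level security: non-admins see only UK rows.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.governance.uk_rows_only(region STRING)
    RETURNS BOOLEAN
    RETURN is_account_group_member('admins') OR region = 'UK'
""")
spark.sql("""
    ALTER TABLE main.silver.customers
    SET ROW FILTER main.governance.uk_rows_only ON (region)
""")

# Least-privilege access for an analyst group.
spark.sql("GRANT SELECT ON TABLE main.silver.customers TO `analysts`")

# Routine Delta maintenance: compact small files, then remove
# unreferenced data files older than the retention window.
spark.sql("OPTIMIZE main.silver.customers")
spark.sql("VACUUM main.silver.customers RETAIN 168 HOURS")
```

On the cost side, the same estate would typically pair scheduled maintenance like this with autoscaling cluster policies and Photon or serverless compute, as the list above suggests.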
If this is relevant to your experience, please apply with your CV and we’ll be in touch.