Senior Databricks Engineer (Data Engineer)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Databricks Engineer (Data Engineer) on a £525/day contract for 6 months, fully remote. Key skills include Databricks, Spark, SQL, and big data technologies. Experience in data pipeline design, optimization, and compliance is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
525
-
🗓️ - Date discovered
September 11, 2025
🕒 - Project duration
6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Outside IR35
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Big Data #Kafka (Apache Kafka) #Spark (Apache Spark) #Agile #Databricks #Data Science #ETL (Extract, Transform, Load) #Scala #SQL (Structured Query Language) #Hadoop #Data Security #Security #Data Pipeline #Databases #Data Engineering #Compliance #Data Quality
Role description
Senior Databricks Engineer (Data Engineer)
Rate: £525 per day | Outside IR35 contract | Work from home
Our client is a global software enterprise looking for a Senior Data Engineer (Databricks) to join their Data Engineering team. The candidate should have experience designing, building, and maintaining data pipelines and infrastructure within the Databricks environment.
Responsibilities:
• Work to standards, principles, and best practices: design, develop, and maintain scalable and robust data pipelines on Databricks.
• Improve performance and fix broken pipelines: optimize and troubleshoot existing data pipelines for performance and reliability.
• Review the current Databricks architecture.
• Redesign the Databricks architecture.
• Collaborate with data scientists and analysts to understand data requirements and deliver solutions.
• Ensure data quality and integrity across various data sources.
• Implement data security and compliance best practices.
• Monitor data pipeline performance and conduct necessary maintenance and updates.
• Document data pipeline processes and technical specifications.
Required skillsets:
• Proficiency with Databricks and Spark.
• Strong SQL skills and experience with relational databases.
• Experience with big data technologies (e.g., Hadoop, Kafka).
• Knowledge of data warehousing concepts and ETL processes.
• Excellent problem-solving and analytical skills.
• Background in both Waterfall and Agile methodologies.
• Experience reviewing user guides, technical guides, ETL, and data model documents.
• Ability to define end-to-end workflows.