

GIOS Technology
Senior Data Engineer (Databricks/PySpark)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Databricks/PySpark) on a contract of unknown length, offering a day rate of £400. It requires extensive hands-on experience with Databricks and Spark/PySpark plus advanced SQL skills, with a preference for Databricks certification. Hybrid working in London.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
400
-
🗓️ - Date
October 17, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Delta Lake #Scala #Data Warehouse #Python #SQL (Structured Query Language) #Databricks #PySpark #Apache Spark #Data Engineering #Data Lake #ACID (Atomicity, Consistency, Isolation, Durability) #ETL (Extract, Transform, Load) #Data Pipeline #Spark (Apache Spark)
Role description
We are hiring a Senior Data Engineer (Databricks/PySpark).
Location: Hybrid, 3 days per week in the London office
• Extensive, hands-on experience as a Data Engineer using the Databricks Lakehouse Platform.
• Lead the architecture and implementation of robust, scalable ETL/ELT data pipelines on Databricks using Spark/PySpark.
• Expert proficiency in Python and Spark/PySpark for data transformation.
• Deep understanding of implementing and optimizing Delta Lake features (ACID properties, Time Travel, partitioning).
• Advanced SQL skills for complex data querying and manipulation.
• Databricks Certified Data Engineer Professional certification preferred.
Key Skills: Databricks Lakehouse / Data Lakes / Data Warehouses / ACID / Delta Lake / Apache Spark / Data Pipelines / CI/CD
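To illustrate the kind of "advanced SQL for complex data querying" the role calls for, here is a minimal, hypothetical sketch of a common ETL pattern: deduplicating records with a window function, keeping only the latest version of each key. SQLite stands in for Databricks SQL so the example is self-contained; the `orders` table and its columns are invented for illustration, not taken from the posting.

```python
import sqlite3

# Self-contained stand-in for a Databricks SQL warehouse: an in-memory
# SQLite database with a small hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL, updated_at TEXT);
INSERT INTO orders VALUES
  (1, 'acme',   100.0, '2025-01-01'),
  (1, 'acme',   120.0, '2025-02-01'),  -- later correction of order 1
  (2, 'globex',  50.0, '2025-01-15');
""")

# Window-function pattern used widely in ETL pipelines: number each row
# within its order_id partition by recency, then keep only the newest row.
latest = conn.execute("""
SELECT order_id, customer, amount
FROM (
  SELECT *,
         ROW_NUMBER() OVER (
           PARTITION BY order_id
           ORDER BY updated_at DESC
         ) AS rn
  FROM orders
)
WHERE rn = 1
ORDER BY order_id
""").fetchall()

print(latest)  # → [(1, 'acme', 120.0), (2, 'globex', 50.0)]
```

On Databricks the same query runs unchanged in Spark SQL, and the dedup result would typically be merged into a partitioned Delta Lake table, where ACID guarantees and Time Travel (querying an earlier table version) come into play.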