Technical Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Technical Data Engineer on a 6+ month hybrid contract based in Northampton or London, with the day rate still to be confirmed. Key skills include Databricks, DBT, and Snowflake, along with a strong understanding of data engineering principles.
🌎 - Country
United Kingdom
πŸ’± - Currency
Β£ GBP
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 16, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Hybrid
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Northampton, England, United Kingdom
🧠 - Skills detailed
#Scala #Spark (Apache Spark) #Data Engineering #Data Pipeline #Data Analysis #Databricks #Delta Lake #Snowflake #Data Quality #dbt (data build tool) #SQL (Structured Query Language) #Data Modeling #Agile #ETL (Extract, Transform, Load)
Role description
Location: Northampton or London / Hybrid, 2-3 days on-site
Duration: 6 months+

We are seeking a highly skilled and communicative Technical Data Engineer to join our team. The ideal candidate will have hands-on experience with modern data platforms and tools including Databricks, DBT, and Snowflake. You will play a key role in designing, developing, and optimizing data pipelines and analytics solutions that drive business insights and decision-making.

Key Responsibilities
• Design, build, and maintain scalable data pipelines using Databricks and DBT.
• Develop and optimize data models and transformations in Snowflake.
• Collaborate with cross-functional teams to understand data requirements and deliver robust solutions.
• Ensure data quality, integrity, and governance across platforms.
• Troubleshoot and resolve data-related issues in a timely manner.
• Document processes, workflows, and technical specifications clearly and effectively.

Required Skills & Experience
Proven hands-on experience with:
• Databricks (Spark, Delta Lake, notebooks)
• DBT (data modeling, transformations, testing)
• Snowflake (SQL, performance tuning, data warehousing)
• Strong understanding of data engineering principles and best practices.
• Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders.
• Experience working in agile environments and collaborating with data analysts, scientists, and business teams.

All profiles will be reviewed against the required skills and experience. Due to the high number of applications, we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!