

hackajob
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on an 18-month fixed-term contract, offering a competitive pay rate. Key skills include SQL, Databricks, Python, and ETL processes. Experience with data quality and leadership is essential. Remote work is available.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
April 11, 2026
Duration
More than 6 months
Location
Unknown
Contract
Fixed Term
Security
Unknown
Location detailed
United Kingdom
Skills detailed
#Databricks #Data Quality #Data Design #Data Engineering #Security #Scala #ETL (Extract, Transform, Load) #Monitoring #Observability #Python #SQL (Structured Query Language) #Leadership #Data Pipeline #Delta Lake
Role description
hackajob is collaborating with Evri to connect them with exceptional professionals for this role.
Senior Data Engineer – Design the Future of Our Data Platform - 18-month Fixed Term Contract
Ready to lead, influence, and build at scale?
If you're an experienced Data Engineer who thrives on solving complex problems, shaping architecture, and raising engineering standards, this is a role where your impact will be felt across the entire business.
As a Senior Data Engineer, you'll design and deliver high-quality, governed data products on our Databricks Lakehouse platform. Blending hands-on engineering with architectural thinking, you'll help define how data is built, optimised, and consumed - today and in the future.
What You'll Be Doing
You'll take ownership of complex data pipelines and platform components, ensuring solutions are scalable, maintainable, and aligned with enterprise governance.
Working closely with architects, analysts, and business stakeholders, you'll translate requirements into robust data designs, while also mentoring junior engineers and contributing to shared standards, frameworks, and best practices.
This role is central to how we evolve our data platform – from ingestion and modelling through to quality, observability, and cost-efficient performance.
Responsibilities
Design and implement complex ETL/ELT pipelines using Databricks (Python, SQL, DLT).
Build and optimise Delta Lake tables with effective partitioning and performance strategies.
Contribute to Lakehouse architecture design, ingestion patterns, and data product boundaries.
Ensure solutions align with governance, security, and lineage standards (Unity Catalog).
Implement automated data quality testing, monitoring, and observability.
Optimise cluster usage, job orchestration, and cost efficiency.
Provide technical leadership and mentoring to other engineers.
Drive continuous improvement through reusable components, frameworks, and innovation.
Interested? Here's What You…






