Stott and May

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in London (Hybrid, 6 months) with a market day rate. Key skills include ETL/ELT pipelines, Snowflake, DBT, AWS/Azure, and advanced SQL. Experience in data governance and cloud architecture is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
February 11, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Database Design #Bash #Python #Leadership #Vault #Data Warehouse #Data Quality #Azure #Redis #Azure Data Factory #GitHub #DevOps #Automation #Airflow #Data Vault #Data Architecture #Data Engineering #Observability #Terraform #Scala #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Monitoring #Version Control #Data Lake #Data Governance #SQL (Structured Query Language) #AWS (Amazon Web Services) #BI (Business Intelligence) #Data Integrity #Data Pipeline #Data Science #Compliance #Alation #SQL Queries #Azure DevOps #dbt (data build tool) #Cloud #Delta Lake #Deployment #Snowflake
Role description
Job Description

Job Title: Senior Data Engineer
Location: London (Hybrid – minimum 2 days per week in the office)
Day Rate: Market rate (Inside IR35)
Contract Duration: 6 months

Role Overview

We are seeking an experienced Senior Data Engineer to design, develop and maintain scalable data pipelines that ensure high-quality, reliable data is available for business decision-making. You will work closely with data architects, product teams, analysts and data scientists to deliver robust data solutions that power analytics, reporting and advanced data products across our retail organisation.

This role requires strong hands-on experience in modern cloud data platforms including Snowflake, DBT and AWS/Azure, alongside expertise in data modelling, ETL/ELT processes and pipeline orchestration. You will also act as a technical mentor within a collaborative and innovative data engineering team.

Key Responsibilities
• Design, develop, optimise and maintain scalable ETL/ELT data pipelines using modern cloud technologies.
• Monitor, troubleshoot and enhance production data pipelines to ensure performance, reliability and data integrity.
• Write and optimise complex SQL queries to support high-performance analytics workloads.
• Implement flexible Data Vault models in Snowflake to support enterprise-scale analytics and business intelligence.
• Build and maintain scalable cloud-based data solutions using Snowflake, DBT, AWS and/or Azure.
• Support infrastructure and deployment automation using Terraform and CI/CD platforms.
• Implement and enforce data quality controls and governance processes to ensure accurate and consistent data flows.
• Support governance frameworks such as Alation and ensure adherence to regulatory and organisational standards.
• Provide technical leadership, mentor junior engineers and promote engineering best practice.
• Collaborate with data scientists to deploy analytical and AI models in production environments.
• Engage with business and product stakeholders to translate requirements into scalable technical solutions.
• Contribute to architecture discussions and recommend improvements based on emerging data engineering trends.

Essential Skills And Experience
• Strong experience building and maintaining ETL/ELT pipelines using DBT, Snowflake, Python, SQL, Terraform and Airflow.
• Proven experience designing and implementing data solutions on cloud-based architectures.
• Experience working with cloud data warehouses and analytics platforms such as Snowflake and AWS or Azure.
• Proficiency using GitHub for version control, collaboration and release management.
• Experience implementing data governance frameworks, including data quality management and compliance practices.
• Advanced SQL skills including complex query writing, optimisation and analytics-focused database design.
• Strong communication and stakeholder engagement skills, with the ability to present technical concepts clearly.
• Excellent problem-solving skills and the ability to translate business requirements into technical solutions.

Languages: Python (primary), SQL, Bash
Cloud: Azure, AWS
Tools: Airflow, DBT
Data Platforms: Snowflake, Delta Lake, Redis, Azure Data Lake
Infrastructure and Operations: Terraform, GitHub Actions, Azure DevOps, Azure Monitor

Desirable Skills And Experience
• Experience with enterprise data platforms such as Snowflake and Azure Data Lake.
• Understanding of monitoring, model performance tracking and observability best practices.
• Familiarity with orchestration tools such as Airflow or Azure Data Factory.