Stott and May

Analytics Consultant

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Analytics Consultant (Analytics Engineer) in London, offering a 6-month contract at market rate (Inside IR35). Key skills include strong SQL, experience with Snowflake and DBT, and proficiency in Python.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 13, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Azure Data Factory #Terraform #Scala #Storytelling #Docker #Datasets #Python #Data Engineering #Agile #FastAPI #Observability #Data Vault #Quality Assurance #Monitoring #Data Science #DevOps #Snowflake #Bash #Strategy #SQL (Structured Query Language) #Azure DevOps #Data Pipeline #dbt (data build tool) #Cloud #Version Control #ADF (Azure Data Factory) #AWS (Amazon Web Services) #Redis #GitHub #Vault #Airflow #Data Lake #ETL (Extract, Transform, Load) #Databases #Data Storytelling #Data Architecture #Delta Lake #Azure
Role description
Analytics Engineer
Location: London - Hybrid (minimum 2 days per week in the office)
Contract Duration: 6 months
Day Rate: Market rate (Inside IR35)

Overview
We are seeking an experienced Analytics Engineer to join a high-performing data function in London. This role sits at the intersection of data engineering, data architecture and analytics, bridging technical delivery with business insight. You will design scalable data models, build robust transformation workflows and deliver high-quality data products in collaboration with Data Architects, Data Scientists, Product Managers and Data Engineers. The focus is on enabling business teams to use internal data assets in a structured, accessible and repeatable manner.

As an Analytics Engineer, you will write high-quality SQL, design complex data models and build reliable data workflows. You will translate business requirements into scalable technical solutions and contribute to the organisation's broader technical strategy.

Key Responsibilities
• Act as a bridge between engineering teams and stakeholders, translating business needs into technical specifications
• Transform and structure raw data into reliable, analysis-ready datasets
• Build and maintain scalable data pipelines and workflows
• Develop and maintain complex data models representing business processes and entities
• Implement data validation and quality assurance processes (a short illustrative sketch follows this description)
• Collaborate with data engineers and architects to ensure seamless integration across platforms
• Automate repetitive data processes to improve efficiency and scalability
• Contribute to the design and delivery of data products aligned to engineering principles and roadmaps
• Support architectural decisions and technical design discussions
• Deliver clear, high-quality insights and support data storytelling initiatives
• Write maintainable, testable code aligned with coding standards and Agile practices

Essential Skills & Experience
• Strong SQL expertise for transformation, modelling and analysis
• Experience with Snowflake, DBT and modern data warehousing approaches
• Strong understanding of data modelling techniques including Data Vault
• Experience using GitHub and version control systems
• Understanding of CI/CD pipelines and service-oriented architecture
• Knowledge of cloud platforms (AWS and/or Azure)
• Proficiency in Python (primary), SQL and Bash
• Experience with modern data architecture frameworks
• Strong understanding of relational and non-relational databases
• Excellent communication skills, able to explain complex concepts clearly
• Strong analytical mindset with the ability to drive business impact through data

Technical Environment
Languages: Python, SQL, Bash
Cloud: Azure, AWS
Data & Platforms: Snowflake, Delta Lake, Redis, Azure Data Lake
Tools: Airflow, DBT, Docker
Infrastructure & Operations: Terraform, GitHub Actions, Azure DevOps, Azure Monitor

Desirable Skills
• Experience with data platforms such as Snowflake or Azure Data Lake
• Experience deploying models as APIs (e.g. FastAPI, Azure Functions)
• Knowledge of monitoring, model performance tracking and observability practices
• Familiarity with orchestration tools such as Airflow or Azure Data Factory
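
For illustration only: the data validation and quality assurance responsibility above typically translates into small, testable checks of the kind sketched below in Python (one of the role's listed languages). The dataset shape, column names and function names here are hypothetical assumptions made for the sketch, not part of the job specification; in a Snowflake/DBT stack the same checks are often expressed as dbt tests instead.

    from dataclasses import dataclass


    @dataclass
    class ValidationResult:
        check: str
        passed: bool
        detail: str


    def validate_rows(rows, key, required):
        """Basic quality checks on analysis-ready rows: unique key, no missing required fields."""
        results = []

        # Uniqueness check on the primary key (mirrors a dbt 'unique' test).
        keys = [r.get(key) for r in rows]
        dupes = len(keys) - len(set(keys))
        results.append(ValidationResult(f"unique_{key}", dupes == 0, f"{dupes} duplicate key(s)"))

        # Not-null checks on required columns (mirrors a dbt 'not_null' test).
        for col in required:
            missing = sum(1 for r in rows if r.get(col) in (None, ""))
            results.append(ValidationResult(f"not_null_{col}", missing == 0, f"{missing} missing value(s)"))

        return results


    if __name__ == "__main__":
        # Hypothetical sample data: one duplicate order_id, one blank customer, one null amount.
        sample = [
            {"order_id": 1, "customer": "acme", "amount": 120.0},
            {"order_id": 2, "customer": "", "amount": 75.5},
            {"order_id": 2, "customer": "globex", "amount": None},
        ]
        for res in validate_rows(sample, key="order_id", required=["customer", "amount"]):
            print(f"{res.check}: {'PASS' if res.passed else 'FAIL'} ({res.detail})")

Running the sketch flags all three checks as failing, which is the point of such gates: bad rows are caught before they reach downstream models rather than after a stakeholder spots them.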