Queen Square Recruitment

Analytics Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Analytics Engineer on a 6-month contract, paying £500 per day, hybrid in Central London. Key skills include Snowflake, dbt, Python, SQL, and data modelling. Cloud experience with AWS or Azure is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
500
🗓️ - Date
February 11, 2026
🕒 - Duration
6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Python #Bash #Azure #Redis #GitHub #Agile #DevOps #Airflow #Data Vault #Data Storytelling #Data Engineering #Observability #Terraform #ETL (Extract, Transform, Load) #Storage #Docker #Monitoring #Version Control #Data Lake #SQL (Structured Query Language) #AWS (Amazon Web Services) #Data Pipeline #Data Science #Azure DevOps #dbt (data build tool) #Cloud #Delta Lake #FastAPI #Databases #Snowflake
Role description
Analytics Engineer – Contract Role

Location: Hybrid – 2 days per week onsite in Central London
Start: ASAP
Duration: 6 months initially
Rate: £500 per day (Inside IR35)

The Role
Our client is seeking an Analytics Engineer to design scalable data models, build robust data workflows, write high-quality SQL, and collaborate across Data Engineering, Architecture, Product, and Data Science. You'll enable the business to access clean, reliable, analytics-ready data and support informed decision-making.

Key Responsibilities
• Act as the bridge between stakeholders and engineering teams, translating business requirements into technical specifications.
• Build and maintain data pipelines; transform raw data into structured, analysable formats.
• Develop data validation processes to ensure quality and accuracy (see the first sketch below).
• Collaborate with data engineers, architects, and product teams to support integrated data solutions.
• Create and maintain complex data models (e.g., data vault, warehousing).
• Automate repetitive data tasks to improve scalability and efficiency.
• Contribute to system and data solution design; support technical decision-making.
• Write high-quality code following TDD/BDD and engineering best practices.
• Deliver clear, high-impact insights through data storytelling.
• Influence and uphold engineering standards, design principles, and roadmap alignment.

Essential Skills & Experience
• Hands-on with Snowflake, dbt, Data Modelling, Data Vault, Data Warehousing, GitHub.
• Knowledge of version control, CI/CD, and service-oriented architecture.
• Cloud experience: AWS or Azure.
• Languages: Python (primary), SQL, Bash.
• Tools: Airflow, dbt, Docker.
• Data/Storage: Snowflake, Delta Lake, Redis, Azure Data Lake.
• Infra/Ops: Terraform, GitHub Actions, Azure DevOps, Azure Monitor.
• Understanding of relational and non-relational databases.
• Ability to deliver maintainable, long-term solutions using Agile ways of working.
• Awareness of engineering standards and frameworks, and when to suggest new ones.
• Strong collaboration and stakeholder engagement skills.

Desirable Skills
• Deploying models as APIs (FastAPI, Azure Functions) – see the second sketch below.
• Monitoring, observability, and model performance tracking.

If you have the required skills & experience, please apply promptly to be considered.
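For illustration only, not part of the client's brief: a minimal Python sketch of the kind of pre-load data validation the responsibilities describe. The record shape, required fields, and business key are hypothetical stand-ins; in a Snowflake/dbt stack, equivalent checks would more often live as dbt schema tests (e.g., not_null, unique).

```python
# Illustrative only: a minimal batch validation pass of the kind the
# responsibilities describe. Field names and key column are hypothetical,
# not details from the posting.

def validate_batch(records, required_fields, key_field):
    """Return a list of human-readable issues found in a batch of records."""
    issues = []
    seen_keys = set()
    for i, row in enumerate(records):
        # Null / missing-value checks on required fields.
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing required field '{field}'")
        # Uniqueness check on the business key.
        key = row.get(key_field)
        if key in seen_keys:
            issues.append(f"row {i}: duplicate key '{key}'")
        seen_keys.add(key)
    return issues


if __name__ == "__main__":
    batch = [
        {"order_id": "A1", "amount": 120.0},
        {"order_id": "A1", "amount": None},  # duplicate key, missing amount
    ]
    for issue in validate_batch(batch, ["order_id", "amount"], "order_id"):
        print(issue)
```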
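Second sketch, also illustrative only: deploying a model as an API with FastAPI, as the desirable-skills list mentions. The "model" is a stub and the endpoint, schema, and field names are hypothetical, not taken from the posting.

```python
# Illustrative only: a minimal FastAPI scoring service. The model is a
# stand-in; endpoint and field names are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Features(BaseModel):
    # Hypothetical input schema for a scoring request.
    amount: float
    num_items: int


def predict(features: Features) -> float:
    # Stand-in for a real trained model.
    return 0.1 * features.amount + 0.5 * features.num_items


@app.post("/score")
def score(features: Features) -> dict:
    return {"score": predict(features)}
```

Saved as main.py, this would run with `uvicorn main:app`; in practice the monitoring and observability points above would sit around such a service.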