

Analytics Engineer Contract
Featured Role | Apply direct with Data Freelance Hub
This is a Contract Analytics Engineer role for "X" months, offering a pay rate of "X" per hour. Key skills include SQL, dbt, Redshift/Snowflake, and Airflow. Familiarity with cloud environments and dimensional modelling is essential.
Country: United Kingdom
Currency: £ GBP
Day rate: -
Date discovered: September 7, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: London
Skills detailed:
#AWS (Amazon Web Services) #Dimensional Modelling #Unit Testing #dbt (data build tool) #Data Warehouse #Cloud #Redshift #Lean #BI (Business Intelligence) #Python #Scala #Data Engineering #Snowflake #Data Pipeline #Microsoft Power BI #Schema Design #Data Mart #Looker #Airflow #SQL (Structured Query Language)
Role description
THE COMPANY
This global business is rapidly scaling its competitions and digital analytics capability, creating a brand-new vertical for revenue management. With reporting still heavily manual in the UK market, the company is investing in building a modern, scalable data infrastructure - and they're looking for a contractor to help lay the foundations.
THE ROLE
As a Contract Analytics Engineer, you'll play a pivotal role in setting up the data warehouse and enabling self-serve reporting for the business. Your responsibilities will include:
• Supporting the design of the data model in Redshift.
• Implementing a Medallion architecture (bronze, silver, gold).
• Using dbt to model data, configure tables, and set up tests (including unit testing).
• Creating a data mart and star schema for financial and revenue reporting.
• Advising on orchestration best practices using Airflow.
• Optimising data pipelines for downstream BI tools (Power BI / Looker).
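To make the data-mart responsibility concrete, here is a minimal star-schema sketch in Python's built-in sqlite3 (standing in for Redshift). All table and column names are hypothetical illustrations, not the company's actual model: one fact table keyed to two dimensions, queried the way a BI tool would.

```python
import sqlite3

# Hypothetical star schema for revenue reporting: one fact table
# surrounded by dimension tables (Kimball-style).
con = sqlite3.connect(":memory:")
cur = con.cursor()

cur.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20250907
    full_date TEXT,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT
);
CREATE TABLE fact_revenue (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    revenue_gbp REAL
);
""")

cur.execute("INSERT INTO dim_date VALUES (20250907, '2025-09-07', 9, 2025)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Competition entry')")
cur.execute("INSERT INTO fact_revenue VALUES (20250907, 1, 125.50)")

# A downstream BI tool (Power BI / Looker) would issue dimensional
# queries like this: slice the fact by dimension attributes.
cur.execute("""
    SELECT d.year, d.month, p.product_name, SUM(f.revenue_gbp)
    FROM fact_revenue f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, d.month, p.product_name
""")
print(cur.fetchall())  # [(2025, 9, 'Competition entry', 125.5)]
```

The design point is that facts hold only keys and measures, while descriptive attributes live in the dimensions - which is what keeps the gold layer easy for self-serve reporting.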
This role won't require building from scratch, but you'll need a solid understanding of incremental loads and strong problem-solving ability to structure data efficiently. Success will be measured by the delivery of a functioning data mart and reporting models, not dashboarding.
YOUR SKILLS & EXPERIENCE
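Since incremental loads are called out explicitly, here is a hedged sketch of the core pattern - in plain Python/sqlite3 rather than dbt, purely to illustrate the idea (table names are invented): only rows newer than the target's high-water mark are merged on each run, analogous to what a dbt incremental model's `is_incremental()` filter produces.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, loaded_at TEXT, amount REAL);
CREATE TABLE stg_orders (order_id INTEGER PRIMARY KEY, loaded_at TEXT, amount REAL);
""")

def incremental_load(con):
    # High-water mark: latest timestamp already present in the target.
    (hwm,) = con.execute(
        "SELECT COALESCE(MAX(loaded_at), '') FROM stg_orders"
    ).fetchone()
    # Upsert only rows that arrived after the high-water mark, instead
    # of rebuilding the whole table on every run.
    con.execute("""
        INSERT INTO stg_orders
        SELECT * FROM src_orders WHERE loaded_at > ?
        ON CONFLICT(order_id) DO UPDATE SET
            loaded_at = excluded.loaded_at,
            amount    = excluded.amount
    """, (hwm,))

# Run 1: two source rows arrive and are loaded.
con.executemany("INSERT INTO src_orders VALUES (?,?,?)",
                [(1, "2025-09-01", 10.0), (2, "2025-09-02", 20.0)])
incremental_load(con)

# Run 2: one new row arrives; the reload touches only that row.
con.execute("INSERT INTO src_orders VALUES (3, '2025-09-03', 30.0)")
incremental_load(con)

print(con.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0])  # 3
```

In a real engagement the same logic would live in a dbt incremental model with Airflow scheduling the runs; the sketch only shows why structuring data around a reliable high-water mark is what makes the approach safe to re-run.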
The successful candidate will have:
• Strong SQL and dbt experience.
• Familiarity with Kimball methodology, dimensional modelling, and star schema design.
• Proven experience with Redshift or Snowflake.
• Strong background in cloud-based data environments (AWS preferred).
• Hands-on experience with Airflow for orchestration.
• (Nice-to-have) Python for data engineering tasks.
• (Nice-to-have) Optimisation for BI tools such as Power BI or Looker.
Soft skills:
• Strong collaboration with both technical and business stakeholders.
• Proactive and structured approach to delivery.
• Ability to work in a lean team with multiple source systems.