Queen Square Recruitment

Data Modeler

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Modeller on a 6-month hybrid contract in London. Key skills include GCP, BigQuery, data modelling techniques, and strong SQL. Experience with data governance and cloud platforms is essential. The role is inside IR35.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 2, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Vault #Scala #Data Management #Cloud #BigQuery #Datasets #Physical Data Model #ETL (Extract, Transform, Load) #Data Ingestion #Storage #Data Architecture #Data Governance #Data Profiling #Clustering #Metadata #Data Quality #ERWin #SQL (Structured Query Language) #GCP (Google Cloud Platform) #AI (Artificial Intelligence) #Dataflow #Vault
Role description
Exciting new role for a Data Modeller

Location: London (Hybrid – 2 days onsite)
Contract: 6 months
IR35 Status: Inside IR35

The Role
We are seeking an experienced Data Modeller to define and design the foundational data structures that enable scalable, insight-driven digital ecosystems. You will play a key role in shaping data architectures across complex transformation programmes, ensuring data models support integration, analytics, and AI-driven decision-making. Working with modern cloud platforms and high-quality datasets, you will collaborate closely with engineering, analytics, and architecture teams to build secure, scalable, and well-governed data solutions that drive business value.

Your Profile
Essential Skills & Experience
• Strong experience designing conceptual, logical, and physical data models on cloud platforms (ideally GCP)
• Hands-on expertise with BigQuery, including performance optimisation (partitioning, clustering, cost efficiency)
• Proven experience with data modelling techniques: 3NF, dimensional (Kimball), Data Vault, and Lakehouse patterns
• Ability to translate complex domain requirements (e.g., Customer, Payments, Risk, Finance) into scalable models
• Experience creating data products with trusted definitions (facts, dimensions, conformed dimensions)
• Strong understanding of data governance, metadata management, lineage, and data quality frameworks
• Proficiency in data modelling tools such as ER/Studio, ERWin, PowerDesigner, or similar
• Experience managing model versioning, artefacts (ERDs, schema scripts), and change control
• Knowledge of GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Storage) and data ingestion patterns
• Strong SQL skills and experience in data profiling and validation
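For candidates unfamiliar with the last requirement: data profiling means computing basic quality statistics (row counts, null counts, distinct values) over a dataset before modelling it. In the role itself this would typically be done in SQL on BigQuery; the sketch below is a minimal, language-agnostic illustration in Python, with hypothetical column names and sample data.

```python
# Minimal data-profiling sketch (illustrative only, not part of the role spec).
# The "customers" records and the "country" column are hypothetical examples.
from collections import Counter

def profile(rows, column):
    """Return basic profile stats for one column: row count, nulls,
    distinct non-null values, and the most frequent value."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(1),
    }

customers = [
    {"id": 1, "country": "GB"},
    {"id": 2, "country": "GB"},
    {"id": 3, "country": None},
]
print(profile(customers, "country"))
# → {'rows': 3, 'nulls': 1, 'distinct': 1, 'top': [('GB', 2)]}
```

The same checks map directly onto SQL aggregates (COUNT(*), COUNTIF(col IS NULL), COUNT(DISTINCT col)) when run against a warehouse table.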