

Gazelle Global
Data Modeler
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Modeler; the contract length and pay rate are unknown. Key skills include GCP, BigQuery, and expertise in data modelling approaches; experience in the banking domain is essential.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: April 2, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: London Area, United Kingdom
Skills detailed: #Data Vault #Scala #Normalization #Indexing #Slowly Changing Dimensions #JSON (JavaScript Object Notation) #Cloud #BigQuery #Stories #Physical Data Model #ML (Machine Learning) #ETL (Extract, Transform, Load) #Storage #Snowflake #Data Integration #Clustering #ERWin #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Data Design #YAML (YAML Ain't Markup Language) #AI (Artificial Intelligence) #Vault #Strategy #BI (Business Intelligence)
Role description
Opportunity for Data Modeller / Data Designer
• This role empowers you to shape end-to-end data ecosystems: accelerating delivery, enhancing data clarity, strengthening operational resilience, and driving organisations toward a more insight-rich, data-enabled future.
• You will define the data blueprints and foundational models that underpin how customers in dynamic, data-intensive industries operate, scale, and innovate.
• You will design robust, future-ready data models that enable seamless integration, advanced analytics, and AI-driven decision making across complex digital transformation programmes.
Essential Skills & Experience - Data Modeller / Data Designer
• Experienced Data Modeller, Data Designer, Data Specialist or similar
• Proven experience delivering conceptual, logical, and physical data models for cloud data platforms, ideally GCP
• Strong hands-on modelling for BigQuery (analytical/columnar patterns, denormalization strategy, partitioning & clustering considerations; see the sketch after this list)
• Expertise in data modelling approaches: 3NF, dimensional (Kimball), Data Vault, and hybrid patterns for Lakehouse designs
• Maintain versioned model artefacts (ERDs, schema scripts, JSON/YAML specs) and change logs; manage controlled evolution of models
• Ability to translate banking domain requirements (Customer, Accounts, Payments, Credit, Risk, Finance) into scalable canonical models
• Strong understanding of BigQuery performance and cost optimisation impacts driven by modelling choices (query patterns, storage, scan costs)
• Experience designing data products for analytics and reporting with trusted definitions (facts, dimensions, SCD, conformed dimensions)
• Proficiency with data modelling tools such as ER/Studio, PowerDesigner, ERWin, SQL Developer Data Modeler, or equivalent cloud-native tools
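By way of illustration for the BigQuery partitioning, clustering, and scan-cost points above, here is a minimal sketch using the Python BigQuery client. The project, dataset, table, and column names (my-gcp-project, banking_dm.fct_payments, payment_ts, customer_sk, account_sk) are hypothetical and not taken from the role description; partitioning on the event date and clustering on frequently filtered keys is one common way such modelling choices feed through to query cost.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and table names for illustration only.
client = bigquery.Client(project="my-gcp-project")

ddl = """
CREATE TABLE IF NOT EXISTS banking_dm.fct_payments (
  payment_id   STRING    NOT NULL,
  customer_sk  INT64     NOT NULL,   -- surrogate key to dim_customer
  account_sk   INT64     NOT NULL,   -- surrogate key to dim_account
  payment_ts   TIMESTAMP NOT NULL,
  amount       NUMERIC,
  currency     STRING
)
PARTITION BY DATE(payment_ts)        -- prunes scans to the dates a query touches
CLUSTER BY customer_sk, account_sk   -- co-locates rows for common filter/join keys
"""
client.query(ddl).result()

# Queries that filter on the partition column only scan the matching partitions,
# which is where most of the storage/scan cost saving comes from.
sql = """
SELECT customer_sk, SUM(amount) AS total_paid
FROM banking_dm.fct_payments
WHERE DATE(payment_ts) BETWEEN '2026-01-01' AND '2026-01-31'
GROUP BY customer_sk
"""
for row in client.query(sql).result():
    print(row.customer_sk, row.total_paid)
```

Which column to partition on and which keys to cluster by depend on the actual query patterns; the point is simply that these are modelling decisions with direct cost consequences.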
Key Responsibilities - Data Modeller / Data Designer
• Define and maintain conceptual, logical, and physical data models that accurately reflect business processes and support analytics, AI/ML, and operational needs.
• Translate business requirements into robust data entities, attributes, relationships, and constraints; ensure traceability from requirements to models.
• Establish and enforce GDM modelling standards and naming conventions (e.g., normalization, dimensional/star/snowflake patterns, data vault where appropriate).
• Design dimensional models (facts, dimensions, hierarchies, slowly changing dimensions) for BI/analytics and performance at scale; a Type 2 sketch follows this list.
• Create and manage canonical data models and semantic layers to enable consistent metrics and self-service analytics across domains.
• Optimise models for performance and cost (partitioning, clustering, indexing, compression, surrogate keys, distribution strategies).
• Drive data integration design across sources (CDC, event streaming, APIs), mapping source-to-target, resolving conflicts, and handling historical changes.
• Support AI/ML readiness by modelling features, aggregations, and histories; collaborate on feature stores and model input/output schemas.
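As a concrete sketch of the slowly changing dimension handling referenced earlier in this list (Type 2 history on a customer dimension), the two-step load below closes the current row when tracked attributes change and then inserts a fresh current row. All table and column names (banking_dm.dim_customer, banking_stg.customer_updates, segment, postcode) are hypothetical assumptions, and this is one common pattern rather than a prescribed approach for the role.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

# Two-step SCD Type 2 load, run as one multi-statement BigQuery job.
scd2_sql = """
-- Step 1: close out current rows whose tracked attributes have changed
-- (NULL-safe comparisons omitted for brevity).
UPDATE banking_dm.dim_customer AS tgt
SET is_current = FALSE,
    valid_to   = CURRENT_TIMESTAMP()
WHERE tgt.is_current = TRUE
  AND EXISTS (
    SELECT 1
    FROM banking_stg.customer_updates AS src
    WHERE src.customer_id = tgt.customer_id
      AND (src.segment != tgt.segment OR src.postcode != tgt.postcode)
  );

-- Step 2: insert a new current row for every changed or brand-new customer.
INSERT INTO banking_dm.dim_customer
  (customer_id, segment, postcode, valid_from, valid_to, is_current)
SELECT src.customer_id, src.segment, src.postcode, CURRENT_TIMESTAMP(), NULL, TRUE
FROM banking_stg.customer_updates AS src
LEFT JOIN banking_dm.dim_customer AS tgt
  ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHERE tgt.customer_id IS NULL;  -- no open row: either new, or just closed in step 1
"""
client.query(scd2_sql).result()
```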
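Finally, for the AI/ML-readiness responsibility directly above, a small sketch of deriving per-customer aggregates into a feature table that a feature store or training pipeline could read. Again the dataset, table, and column names (banking_ml.customer_payment_features, banking_dm.fct_payments) are illustrative assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

# Materialise simple per-customer behavioural features over a 90-day window.
feature_sql = """
CREATE OR REPLACE TABLE banking_ml.customer_payment_features AS
SELECT
  customer_sk,
  COUNT(*)        AS payment_count_90d,
  SUM(amount)     AS payment_total_90d,
  AVG(amount)     AS payment_avg_90d,
  MAX(payment_ts) AS last_payment_ts
FROM banking_dm.fct_payments
WHERE payment_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
GROUP BY customer_sk
"""
client.query(feature_sql).result()
```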






