

TESTQ Technologies Limited
Data Warehouse Modeler
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Warehouse Modeler with a contract length of "unknown" and a pay rate of "unknown." Required skills include Snowflake, SQL, and dbt. Candidates should have 7 years of experience in data engineering within enterprise environments.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 16, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Basildon, England, United Kingdom
-
🧠 - Skills detailed
#Snowflake #SSIS (SQL Server Integration Services) #Cloud #Informatica #Data Quality #Spark (Apache Spark) #Clustering #SQL (Structured Query Language) #Data Modeling #Documentation #Data Vault #Data Mapping #Data Engineering #Data Warehouse #Computer Science #Data Integration #PySpark #Vault #ETL (Extract, Transform, Load) #Data Ingestion #Physical Data Model #Talend #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #dbt (data build tool) #ODI (Oracle Data Integrator) #AI (Artificial Intelligence)
Role description
Mandatory Skills: Snowflake
Overview
• We are seeking a talented and experienced Data Warehouse Modeler and Engineer (dbt/Snowflake) to join the Data Solutions domain in EMEA.
• As a key contributor, you will be responsible for designing and implementing end-to-end data integration and transformation solutions on Snowflake.
• This role combines ETL/ELT design, data mapping, data modeling, and hands-on data engineering using dbt.
Key Responsibilities
• Design and implement ETL/ELT pipelines for data ingestion, transformation, and consolidation.
• Perform data mapping from source systems to target Snowflake structures.
• Design and maintain conceptual, logical, and physical data models (staging, integration, marts).
• Develop, test, and maintain dbt models following best practices.
• Optimize Snowflake performance (query design, materializations, clustering).
• Ensure data quality, consistency, lineage, and documentation.
• Collaborate with engineering, analytics, and platform teams to align on data patterns and standards.
Technical Qualifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
• 7 years of experience in a similar role within a fast-paced, enterprise-scale environment.
• Strong proficiency in SQL and hands-on experience with dbt (Core or Cloud) or other traditional ETL tools (ODI, Informatica, Talend, SSIS, etc.).
• Snowflake data engineering.
• SQL data modeling (Kimball, Inmon, Data Vault concepts), ETL/ELT design, and data mapping.
• Data quality, governance, and performance optimization.
• A solid understanding of end-to-end data ingestion and transformation processes.
• AWS data stack (S3, Glue, EMR/PySpark, Iceberg) nice to have.
Non-Technical Skills
• Demonstrated commitment, ownership, and accountability.
• Strong attention to detail with a proactive approach to identifying and resolving problems.
• A genuine interest in emerging data technologies and a curiosity about continuous improvement.
• Excellent verbal and written communication skills.
• AI-empowered experience.






