

RED Global
Snowflake Data Warehouse Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Warehouse Engineer on a 6-month contract (with extension potential) in the UK/EMEA. Key skills include Snowflake, dbt, SQL, ETL/ELT design, and enterprise-scale data experience. Full-time workload; start ASAP.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
May 1, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Unknown
📄 - Contract
Fixed Term
🔒 - Security
Unknown
📍 - Location detailed
Essex, England, United Kingdom
🧠 - Skills detailed
#Talend #Documentation #AWS (Amazon Web Services) #Data Ingestion #ETL (Extract, Transform, Load) #Vault #SSIS (SQL Server Integration Services) #Data Quality #Data Warehouse #Spark (Apache Spark) #Data Mapping #Data Integration #S3 (Amazon Simple Storage Service) #Data Mart #dbt (data build tool) #PySpark #ODI (Oracle Data Integrator) #Data Engineering #Informatica #Snowflake #SQL (Structured Query Language) #Cloud #Data Vault
Role description
RED is currently looking for a Snowflake Data Warehouse Engineer for an initial 6-month contract with a high chance of extension.
This role is suited to a hands-on Data Engineer with strong experience across Snowflake, dbt, SQL, data modelling, ETL/ELT design, and data warehouse development. You will be responsible for designing and delivering end-to-end data integration and transformation solutions within an enterprise-scale data environment.
Contract Details
• Role: Data Warehouse Engineer
• Contract: 6 Months+
• Start Date: ASAP
• Location: UK / EMEA
• Work Model: TBC
• Contract Type: TBC
• Workload: Full-time
Key Skills Required
• Strong experience as a Data Engineer / Data Warehouse Engineer
• Snowflake data engineering experience
• Strong SQL development experience
• Hands-on experience with dbt Core or dbt Cloud
• ETL / ELT pipeline design and development
• Data mapping from source systems into Snowflake
• Data modelling experience: conceptual, logical, and physical models
• Experience with Kimball, Inmon, or Data Vault modelling approaches
• Data warehouse design, including staging, integration, and data marts
• Performance optimisation within Snowflake
• Data quality, governance, lineage, and documentation experience
• Experience working with enterprise-scale data environments
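To give a flavour of the staging-to-mart transformation and data-quality work listed above, here is a minimal, purely illustrative Python sketch (the table shape, column names, and rules are hypothetical, not taken from the role spec):

```python
def stage_to_mart(raw_rows):
    """Deduplicate source rows by order_id (latest version wins), normalise
    values, and reject rows that fail basic data-quality checks."""
    staged = {}
    rejects = []
    for row in raw_rows:
        # Quality gate: the business key and amount must be present
        if row.get("order_id") is None or row.get("amount") in (None, ""):
            rejects.append(row)
            continue
        staged[row["order_id"]] = {
            "order_id": row["order_id"],
            "amount": round(float(row["amount"]), 2),       # normalise to 2 dp
            "region": (row.get("region") or "UNKNOWN").upper(),
        }
    return list(staged.values()), rejects

raw = [
    {"order_id": 1, "amount": "10.5", "region": "emea"},
    {"order_id": 1, "amount": "12.0", "region": "emea"},   # later version wins
    {"order_id": 2, "amount": None, "region": "uk"},       # fails quality gate
]
clean, bad = stage_to_mart(raw)
print(clean)     # [{'order_id': 1, 'amount': 12.0, 'region': 'EMEA'}]
print(len(bad))  # 1
```

In a real Snowflake/dbt setup this logic would typically live in SQL models with dbt tests enforcing the quality rules; the sketch only illustrates the pattern.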
Nice to Have
• AWS data stack experience
• S3, Glue, EMR, PySpark, or Iceberg
• Traditional ETL tooling experience such as ODI, Informatica, Talend, or SSIS
• Strong understanding of end-to-end data ingestion and transformation processes
You will be working closely with engineering, analytics, and platform teams to define data patterns, build reliable transformation pipelines, and ensure data is structured, governed, and optimised for business use.
If this is something you are interested in, please send me an up-to-date CV and we can discuss the role in more detail.