

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "$XX per hour." Key skills include Snowflake, dbt, and SQL, along with a minimum of 5 years of data engineering experience, particularly with ETL-to-ELT transitions.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
April 29, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Processing #Data Storage #Programming #Data Governance #Data Integrity #dbt (data build tool) #Data Lifecycle #Database Management #Data Engineering #Scripting #Databases #Data Integration #Data Cleansing #Data Transformations #Compliance #Scala #ETL (Extract, Transform, Load) #Storage #Data Pipeline #Security #SAP #Snowflake #Python #SQL (Structured Query Language)
Role description
Data Engineer
Project Overview:
After a recent merger of two large business units, we are embarking on a project to reengineer and migrate their end-to-end reporting requirements (direct involvement with pipelines and systems) and operational systems (indirect involvement with data). This transition includes a shift from the existing ETL processes to a modern ELT data infrastructure, using Snowflake for database management and dbt for data transformation, to establish robust data pipelines.
We need someone with strong dbt, SQL, and Snowflake skills, and experience with Kimball dimensional modelling.
Key Responsibilities:
1. Data Integration and Pipeline Development: Develop, construct, test, and maintain architectures such as databases and large-scale processing systems, using Snowflake and dbt for data transformations.
2. ETL to ELT Transition: Migrate existing ETL processes to modern ELT processes, ensuring seamless data flow and integration across platforms.
3. Data Cleansing and Alignment: Conduct comprehensive data cleansing to unify, correct, and standardize large data sets, ensuring data integrity across Snowflake, IFS, and SAP ECC 6.0 systems according to designs set by Enterprise Architecture teams.
4. Data Governance and Compliance: Recommend data governance policies and procedures to manage the data lifecycle, ensuring compliance with data protection regulations and best practices.
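The cleansing-and-alignment responsibility above can be sketched in plain Python. This is a minimal illustration under stated assumptions, not project code: the field names and the two record shapes are hypothetical stand-ins for rows arriving from systems such as IFS and SAP ECC 6.0.

```python
# Minimal data-cleansing sketch: unify records from two hypothetical
# source systems into one standardized schema so they can be compared
# and de-duplicated. All field names are invented for illustration.

def standardize(record: dict, source: str) -> dict:
    """Map a raw record from either source system to a common schema."""
    if source == "ifs":
        name, country = record["cust_name"], record["country_code"]
    elif source == "sap":
        name, country = record["NAME1"], record["LAND1"]
    else:
        raise ValueError(f"unknown source: {source}")
    return {
        "name": " ".join(name.split()).title(),  # collapse whitespace, normalize case
        "country": country.strip().upper(),      # standardize country codes
        "source_system": source,
    }

ifs_row = {"cust_name": "acme   widgets", "country_code": "gb"}
sap_row = {"NAME1": "ACME WIDGETS", "LAND1": " GB "}

unified = [standardize(ifs_row, "ifs"), standardize(sap_row, "sap")]
# Both rows now share one schema, so downstream matching can treat
# them as the same customer despite differing source formats.
```

In practice this alignment logic would live in the warehouse layer (for example as dbt staging models) rather than in application code, but the mapping-to-a-common-schema idea is the same.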
Required Skills & Experience:
1. Proficiency with Snowflake: In-depth knowledge of Snowflake’s data warehousing solutions, including architecture, security, and data storage optimizations.
2. Experience with dbt (data build tool): Demonstrated capability in using dbt for performing complex data transformations within data pipelines.
3. Strong Background in Data Engineering: Minimum of 5 years of experience in data engineering, with a focus on building scalable and high-performance data infrastructures.
4. Programming Skills: Proficiency in SQL and experience with scripting languages such as Python for data processing.
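The ETL-to-ELT shift this role centres on can be illustrated with a short, self-contained sketch: land the raw data untransformed, then do the casting and standardization inside the database with SQL, which is the layer a dbt model would own. Here sqlite3 is only a stand-in for Snowflake, and the table and column names are hypothetical.

```python
import sqlite3

# ELT sketch: load raw rows first, transform with SQL afterwards.
# sqlite3 stands in for Snowflake; all names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT, country TEXT)")

# 1. Load: raw strings go in exactly as extracted -- no upfront transform.
rows = [("A-1", "10.50", " gb "), ("A-2", "4.25", "GB"), ("A-3", "7.00", "fr")]
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)

# 2. Transform: typing and cleanup happen in the warehouse via SQL,
#    the step that would be expressed as a dbt model in this stack.
conn.execute("""
    CREATE TABLE stg_orders AS
    SELECT order_id,
           CAST(amount AS REAL) AS amount,
           UPPER(TRIM(country)) AS country
    FROM raw_orders
""")

total_gb = conn.execute(
    "SELECT SUM(amount) FROM stg_orders WHERE country = 'GB'"
).fetchone()[0]
```

The design point is that the raw layer stays a faithful copy of the source, so transformations can be rerun, audited, and versioned independently of extraction.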