

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in London, on a contract basis, offering competitive pay. Key requirements include expertise in Data Vault 2.0, Snowflake, SQL, ETL/ELT processes, and programming skills in Python or Java.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
May 15, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#BI (Business Intelligence) #Data Engineering #Scala #Programming #Data Pipeline #Java #SQL (Structured Query Language) #Data Vault #dbt (data build tool) #Documentation #Kafka (Apache Kafka) #Data Mart #Python #ETL (Extract, Transform, Load) #Data Warehouse #Databases #Data Integration #Snowflake #Vault #Storage
Role description
Title: Data Engineer with Vault 2.0
Location: London
Duration: Contract
Job Description: Data Engineer
We are looking for a forward-thinking Data Engineer to build a robust Data Vault 2.0 architecture on Snowflake, so we can better serve and deliver valuable data products to our customers (internal and end-client).
Key Responsibilities:
Data Vault 2.0 Implementation (must-have):
o Design and implement Data Vault 2.0 architecture on the Snowflake data platform.
o Develop and maintain hubs, links, and satellites in the Data Vault 2.0 model.
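For illustration, a minimal Snowflake DDL sketch of one hub, one link, and one satellite. All schema, table, and column names here (raw_vault, hub_customer, and so on) are hypothetical, and MD5 is used for the hash keys; SHA-256 is a common alternative.
```sql
-- Hypothetical raw-vault objects; names and payload columns are placeholders.
CREATE TABLE IF NOT EXISTS raw_vault.hub_customer (
    hub_customer_hk  BINARY(16)    NOT NULL,  -- MD5 hash of the business key
    customer_bk      VARCHAR       NOT NULL,  -- business key from the source system
    load_dts         TIMESTAMP_NTZ NOT NULL,  -- when the row entered the vault
    record_source    VARCHAR       NOT NULL,  -- originating system
    CONSTRAINT pk_hub_customer PRIMARY KEY (hub_customer_hk)
);

CREATE TABLE IF NOT EXISTS raw_vault.link_customer_order (
    link_customer_order_hk  BINARY(16)    NOT NULL,  -- hash over both parent keys
    hub_customer_hk         BINARY(16)    NOT NULL,
    hub_order_hk            BINARY(16)    NOT NULL,
    load_dts                TIMESTAMP_NTZ NOT NULL,
    record_source           VARCHAR       NOT NULL,
    CONSTRAINT pk_link_customer_order PRIMARY KEY (link_customer_order_hk)
);

CREATE TABLE IF NOT EXISTS raw_vault.sat_customer_details (
    hub_customer_hk  BINARY(16)    NOT NULL,  -- parent hub key
    load_dts         TIMESTAMP_NTZ NOT NULL,
    hash_diff        BINARY(16)    NOT NULL,  -- change-detection hash of the payload
    customer_name    VARCHAR,
    customer_email   VARCHAR,
    record_source    VARCHAR       NOT NULL,
    CONSTRAINT pk_sat_customer_details PRIMARY KEY (hub_customer_hk, load_dts)
);
```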
Data Integration:
o Build robust and scalable data pipelines to ingest and process data from various sources (e.g., databases, APIs, streaming platforms).
o Implement ETL/ELT processes to ensure efficient data loading, transformation, and storage.
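As a sketch of the ELT pattern on Snowflake: land raw files from a stage, then run an insert-only load into the vault. The stage path, staging table (assumed to have a single VARIANT column named payload), and JSON field names are all assumptions.
```sql
-- Land raw JSON files from a (hypothetical) stage into a one-column VARIANT table.
COPY INTO staging.customer_raw
FROM @staging.src_files/customers/
FILE_FORMAT = (TYPE = 'JSON');

-- Insert-only hub load: add only business keys the vault has not seen yet.
INSERT INTO raw_vault.hub_customer (hub_customer_hk, customer_bk, load_dts, record_source)
SELECT DISTINCT
    MD5_BINARY(UPPER(TRIM(s.payload:customer_id::VARCHAR))),
    s.payload:customer_id::VARCHAR,
    CURRENT_TIMESTAMP(),
    'CRM_EXPORT'                               -- placeholder source-system label
FROM staging.customer_raw s
WHERE NOT EXISTS (
    SELECT 1
    FROM raw_vault.hub_customer h
    WHERE h.hub_customer_hk = MD5_BINARY(UPPER(TRIM(s.payload:customer_id::VARCHAR)))
);
```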
Data Marts:
o Design and build data marts to support business intelligence and analytical requirements.
o Collaborate with business stakeholders to understand data needs and deliver tailored data products.
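As one possible shape for a mart object: a current-state dimension view over the vault, taking the latest satellite row per hub key. The schema and view names are hypothetical and reuse the tables from the earlier sketch.
```sql
-- Hypothetical current-state customer dimension for a sales mart.
CREATE OR REPLACE VIEW mart_sales.dim_customer AS
SELECT
    h.customer_bk  AS customer_id,
    s.customer_name,
    s.customer_email,
    s.load_dts     AS valid_from
FROM raw_vault.hub_customer h
JOIN raw_vault.sat_customer_details s
  ON s.hub_customer_hk = h.hub_customer_hk
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY s.hub_customer_hk
    ORDER BY s.load_dts DESC
) = 1;
```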
Performance Optimization:
o Optimize data models, queries, and storage solutions for performance and cost efficiency.
o Monitor and troubleshoot performance issues in data pipelines and data warehouses.
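On Snowflake, this kind of tuning might include clustering a large satellite on its join and filter keys and then checking clustering health; the table below is the hypothetical one from the earlier sketch.
```sql
-- Cluster a large satellite on the keys most queries join and filter on.
ALTER TABLE raw_vault.sat_customer_details
    CLUSTER BY (hub_customer_hk, load_dts);

-- Inspect how well the table is clustered on those keys.
SELECT SYSTEM$CLUSTERING_INFORMATION(
    'raw_vault.sat_customer_details',
    '(hub_customer_hk, load_dts)'
);
```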
Collaboration & Communication:
o Work closely with analysts and business stakeholders to gather requirements and deliver data solutions.
Documentation & Standards:
o Create and maintain comprehensive documentation (we use Confluence).
o Adhere to industry best practices and company standards for data engineering and governance.
Required Skills and Qualifications
Technical Expertise:
o Proven experience with Data Vault 2.0 methodology.
o Hands-on experience with Snowflake data platform.
o Proficiency in SQL and ETL/ELT processes.
o Experience with data streaming technologies (e.g., Kafka, Kinesis).
Programming & Tools:
o Strong programming skills in Python, Java, or similar languages.
o Hands-on dbt experience is preferred (see the dbt sketch after this list).
Analytical Skills:
o Strong analytical and problem-solving skills.
o Ability to design scalable and efficient data models.
Communication:
o Independent-minded self-starter; we are a small team with a lot still to work out.
o Excellent written and verbal communication skills.
o Ability to work effectively in a collaborative team environment.
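A minimal sketch of what a dbt model feeding the vault could look like, assuming dbt with the Snowflake adapter; the model path, source names, and JSON fields are placeholders, not a prescribed project layout.
```sql
-- models/staging/stg_customer.sql (hypothetical dbt model: Jinja config + SQL)
{{ config(materialized='view') }}

SELECT
    MD5_BINARY(UPPER(TRIM(payload:customer_id::VARCHAR)))  AS hub_customer_hk,
    payload:customer_id::VARCHAR                           AS customer_bk,
    -- change-detection hash over the descriptive payload
    MD5_BINARY(CONCAT_WS('||',
        COALESCE(payload:name::VARCHAR,  ''),
        COALESCE(payload:email::VARCHAR, '')))             AS hash_diff,
    payload:name::VARCHAR                                  AS customer_name,
    payload:email::VARCHAR                                 AS customer_email
FROM {{ source('staging', 'customer_raw') }}
```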
Release Comments: Would suit a data engineer with good exposure to architecture. Data Vault exposure within financial services (FS) would be a plus. A pragmatic self-starter who is comfortable working in a small team.