

Hays
Data Engineer - Azure / GCP, Data Lake, Snowflake
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with expertise in Azure/GCP and Data Lakes, offering up to £700 per day for 6 months in London/Hybrid. Requires experience in large regulated organisations, strong communication skills, and data analysis capabilities.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
700
-
🗓️ - Date
October 9, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#GraphQL #GCP (Google Cloud Platform) #Azure #Data Lake #ETL (Extract, Transform, Load) #Databricks #Data Architecture #Snowflake #Cloud #Security #Data Engineering #Data Management
Role description
Data Engineer - Azure / GCP, Data Lake, Snowflake
Up to £700 per day (Inside IR35)
London / Hybrid (1-2 days per week hybrid working)
6 months
I am currently working with an instantly recognisable, high-profile client who urgently requires a Data Engineer with expertise in Azure / GCP and Data Lakes to join a major transformation programme. The role involves expanding existing data sources and identifying new ones to produce more metrics, driving data capability across the entire organisation and helping to bridge the gap between Data Engineering and Security.
Key Requirements:
• Proven experience as a Data Engineer in a large, complex, regulated organisation
• Expertise with Cloud Platforms (Azure and GCP preferred)
• Previous experience of working with Data Lakes
• Demonstrable experience of ingesting, extracting and analysing Data from diverse sources
• Ability to create a centralised, standardised view using data from multiple Business / Market Units across the entire organisation
• Understanding of future hosting model(s)
• Ability to guide Market Units on their data capability, working with vendors / third parties to identify what more can be done
• Strong communication skills and ability to work autonomously and drive innovation
Nice to have:
• Familiarity with Data Architecture
• Exposure to Cyber Security tooling or working closely with InfoSec / Risk teams
• Understanding of Data Management frameworks (DCAM, DMBOK)
• Working knowledge of GraphQL / Databricks / Snowflake
• Previous experience of working with Medical / Healthcare Data
• Immediate availability
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and as an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at hays.co.uk