

X4 Technology
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position on a 6-month remote contract, with the pay rate dependent on experience (DOE, Outside IR35). Key skills required include Azure, Snowflake, Informatica, and Databricks, with a focus on building scalable data pipelines and cloud systems.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 9, 2025
🕒 - Duration
6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Data Security #Scala #Informatica #Azure SQL #Data Engineering #BI (Business Intelligence) #Azure #Snowflake #Compliance #Data Science #Python #Cloud #Databricks #Security #Programming #Data Storage #SQL (Structured Query Language) #Data Pipeline #ETL (Extract, Transform, Load) #ML (Machine Learning) #Data Processing #Storage
Role description
Job Title: Data Engineer
Rate: DOE (Outside IR35)
Location: Remote
Contract Length: 6 months
A Microsoft-partnered consultancy client of ours has recently secured a project with one of their key clients. They are seeking a Databricks-focused Data Engineer to work with them on a contract basis. This is an exciting opportunity to work on cutting-edge data projects, building scalable data pipelines and cloud-based systems that deliver real impact.
Experience with Snowflake, Informatica, and Azure is a must.
Key Responsibilities:
• Design, develop, and maintain scalable and high-performance data pipelines on Azure.
• Optimise data storage and retrieval processes to enhance performance and reduce costs.
• Collaborate with data scientists, analysts, and product teams to deliver data-driven solutions that support business objectives.
• Work with both structured and unstructured data to assist with business intelligence, analytics, and machine learning initiatives.
• Ensure data security, governance, and compliance within the cloud environment.
• Troubleshoot and optimise existing cloud-based data infrastructure to improve efficiency and cost-effectiveness.
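To give a flavour of the pipeline work described above, here is a minimal extract-transform-load sketch. It is illustrative only and uses just the Python standard library; in this role the equivalent steps would run on Databricks, Informatica, or Snowflake rather than in-memory structures.

```python
# Minimal ETL sketch (illustrative, stdlib only).
# Real pipelines here would use Databricks/Informatica/Snowflake connectors.
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalise types and drop rows missing required fields."""
    out = []
    for row in rows:
        if not row.get("amount"):
            continue  # skip incomplete records rather than loading bad data
        out.append({"id": int(row["id"]), "amount": float(row["amount"])})
    return out


def load(rows: list[dict], target: list) -> None:
    """Load: append clean rows to an in-memory stand-in for a warehouse table."""
    target.extend(rows)


raw = "id,amount\n1,10.5\n2,\n3,7.25\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
```

The same extract/transform/load separation is what makes a pipeline testable and optimisable stage by stage, which is the thrust of the responsibilities listed above.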
Experience and Qualifications Required:
• Proven experience as a Data Engineer, Cloud Engineer, or in a similar role with hands-on expertise in Azure.
• Hands-on experience with Snowflake.
• Strong capability in Informatica.
• Solid experience with data processing frameworks (Databricks/Azure SQL).
• Proficiency in SQL, Python, or other programming languages.
• Strong understanding of ETL processes, data modelling, and optimisation techniques.
• A collaborative mindset and the ability to work with cross-functional teams in a fast-paced environment.
If you feel this is an exciting opportunity for you, please apply with your CV.