Infoplus Technologies UK Limited

Azure Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Engineer in Warwick, UK, lasting 6+ months at £300-£350 per day, Inside IR35. Key skills include ETL, Power BI, Python, SQL, and experience with Snowflake. A degree and the Microsoft Certified: Azure Data Engineer Associate certification are essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
350
🗓️ - Date
October 24, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
Warwick, England, United Kingdom
🧠 - Skills detailed
#Data Engineering #BI (Business Intelligence) #dbt (data build tool) #Data Management #API (Application Programming Interface) #Data Governance #Big Data #Synapse #Azure #Microsoft Azure #Trino #Data Science #Data Lineage #Microsoft Power BI #Databricks #Snowflake #Scala #Python #Data Quality #Data Access #Data Lake #Datasets #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Jira #Azure Data Factory #Agile #DAX #GIT #ADF (Azure Data Factory) #Data Mining #Metadata #Data Pipeline
Role description
Role - Azure Data Engineer
Location - Warwick, UK
Duration - 6+ months with strong possibility of extension
Work Type - Hybrid (2 days per week in office)
Rate - £300 to £350 per day, Inside IR35

Job Summary:
To provide analytical expertise across all areas of ET in the manipulation, interpretation, and management of data. The role focuses on designing and implementing architectural and data model changes across core applications and strategic data platforms to support business outcomes. A key purpose of the role is to design, build, and maintain high-quality data products that are discoverable, reusable, and aligned with domain needs, enabling decentralised data ownership and supporting a data mesh approach. The role also involves profiling data to implement quality rules that provide visibility into data health and drive improvements to support business insight and performance metrics.

Key Accountabilities:
• Develop and maintain ETL pipelines and data models for analytics and reporting platforms.
• Design, build, and maintain data products that serve specific business domains and use cases.
• Conduct advanced data mining and manipulation, advising on architecture and data model changes.
• Design and deliver Power BI dashboards and reports that provide actionable insights to business stakeholders.
• Set strategic roadmaps for information quality and ensure critical data dependencies are recognised and governed.
• Collaborate with business users to catalogue data lineage and develop data quality dashboards using modern tooling.
• Ensure data consistency through cleaning, transformation, and validation processes.
• Support data scientists and analysts by delivering robust data solutions aligned with business objectives.
• Build and maintain APIs to enable secure, scalable data exchange across systems.
• Develop and optimise data pipelines using Snowflake and other big data tools such as Databricks and Trino.
• Contribute to the implementation of data mesh principles and federated data governance.
• Leverage Data Fabric platforms such as Promethium to unify data access, discovery, and governance.
• Integrate and orchestrate data flows using MuleSoft and SnapLogic.

Required Experience:
• Strong experience in data mining, profiling, and manipulation of large datasets.
• Proficiency in writing ETL code and implementing data quality rules.
• Advanced Power BI skills, including DAX, Power Query (M), data modelling, and report design.
• Skilled in Python, SQL, Azure Data Factory, Git, and dbt.
• Experience with Snowflake and big data tooling (e.g. Databricks, Trino).
• Strong experience in API development and testing.
• Proficient in deploying and managing data infrastructure in Microsoft Azure (e.g. Azure Data Lake, Azure Synapse, Azure Functions).
• Familiarity with data mesh concepts such as data as a product, domain ownership, and federated governance.
• Demonstrated experience in data governance, including stewardship, policy implementation, and metadata management.
• Experience working with Data Fabric platforms (e.g. Promethium) and integration tools such as MuleSoft and SnapLogic.
• Knowledge of JIRA and agile delivery practices.
• Effective communicator with the ability to translate complex data issues for diverse audiences.

Essential Requirements:
1. Degree or equivalent in a quantitative or analytical discipline
2. Microsoft Certified: Azure Data Engineer Associate (DP-203)