Insight Global

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer; the contract length and pay rate are unknown. Key skills include Azure Data Factory, Databricks, PySpark, SQL, and AI experience, along with proven experience in data engineering and data strategy.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Strategy #Scala #MLflow #REST (Representational State Transfer) #Azure #Data Engineering #Python #Data Pipeline #ADF (Azure Data Factory) #Azure Databricks #PySpark #AI (Artificial Intelligence) #Azure Data Factory #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Data Extraction #API (Application Programming Interface) #SQL (Structured Query Language) #SQL Queries #Strategy #Databricks
Role description
Insight Global is seeking a Data Engineer. The Data Engineer designs and builds data foundations and end-to-end solutions that help the business maximize value from data. The role fosters data-driven thinking across the organization, not just within IT teams but also in the wider business stakeholder community. The Senior Data Engineer is expected to be a subject matter expert who designs and builds data solutions and mentors junior engineers, and is a key driver in converting the vision and data strategy into IT solutions and delivering them.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines using Azure Data Factory and Azure Databricks.
• Implement data transformation workflows using PySpark and Delta Live Tables (DLT).
• Manage and govern data assets using Unity Catalog.
• Write efficient and optimized SQL queries for data extraction, transformation, and analysis.
• Collaborate with cross-functional teams to understand data requirements and deliver high-quality solutions.
• Demonstrate strong ownership and accountability in delivering end-to-end data solutions.
• Communicate effectively with stakeholders to gather requirements, provide updates, and manage expectations.
The applicant must have the following:
• Proven hands-on experience with Azure Data Factory, Databricks, DLT, and Unity Catalog
• Hands-on experience with Azure Functions and/or REST/FastAPI, Logic Apps
• Hands-on Databricks expertise in:
1. Performance tuning and cost optimization
2. Databricks Structured Streaming
3. Databricks Delta Live Tables
4. Databricks DBU/compute performance optimization
• AI experience with Databricks Mosaic AI and/or Databricks MLflow, Power BI, vector embedding/search, LLM integration, prompt engineering
• Proficient with Python and PySpark
• Strong command of SQL and data modelling concepts
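For candidates unfamiliar with the shape of this work, the extract-transform-load pattern behind the responsibilities above can be sketched in miniature. This is a purely illustrative, stdlib-only toy (SQLite standing in for Databricks/ADF; all table and column names are invented), not code from the role itself:

```python
import sqlite3

# Toy ETL sketch: extract raw records, transform (normalise types and
# casing), load into a SQL store, then analyse with a SQL query -- a
# scaled-down stand-in for the pipeline work described in this posting.
raw_orders = [
    {"order_id": 1, "amount": "20.50", "region": "uk"},
    {"order_id": 2, "amount": "5.25", "region": "uk"},
    {"order_id": 3, "amount": "42.50", "region": "de"},
]

def transform(row):
    # Cast amount to a number and upper-case the region code.
    return (row["order_id"], float(row["amount"]), row["region"].upper())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [transform(r) for r in raw_orders])

# Analysis step: total order value per region.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
))
print(totals)  # {'DE': 42.5, 'UK': 25.75}
```

In the actual role, the extract and load stages would be orchestrated by Azure Data Factory and the transform stage expressed in PySpark or Delta Live Tables, but the pipeline structure is the same.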