E-Solutions

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "X months" at a pay rate of "$X/hour" located in "Location". Key skills include MySQL, advanced SQL, Python scripting, and experience with geospatial datasets and public datasets, particularly ONS data.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Datasets #DBA (Database Administrator) #Scripting #Statistics #Data Quality #Azure Data Factory #Data Ingestion #Data Processing #ETL (Extract, Transform, Load) #Libraries #ADF (Azure Data Factory) #Data Profiling #Spark (Apache Spark) #SQL (Structured Query Language) #Pandas #Spatial Data #Python #Matplotlib #Automation #Anomaly Detection #Azure #Data Engineering #MySQL #Data Analysis
Role description
Key Responsibilities
• Design and implement data ingestion, import/export, and transformation pipelines.
• Handle data extracts, schema discovery, incremental loads, and multi-source integrations.
• Build data transformation pipelines including profiling, cleansing, standardization, conformance, and publishing.
• Develop rules for data completeness, validity, and consistency, including exception handling.
• Perform advanced SQL-based data profiling, joins/merges, deduplication, anomaly detection, and performance tuning.
• Automate data processes using Python scripting and data processing frameworks.
• Work with geospatial datasets and spatial analysis workflows to support analytical outputs.
• Ensure secure, auditable, and reproducible data outputs for analytics teams.

Required Skills & Experience
• Strong experience as a Data Engineer / Logical DBA with MySQL expertise.
• Advanced SQL skills for data analysis, transformation, and performance tuning.
• Strong Python scripting experience for automation, rules engines, and data quality checks.
• Experience with Python libraries such as Pandas, Polars, Scikit-learn, and Matplotlib.
• Experience with modern data tools such as Spark, Azure Data Factory, or similar platforms.
• Proven experience working with geospatial data (GeoJSON, shapefiles, vector/raster formats).
• Understanding of coordinate reference systems and spatial data processing.
• Experience working with public datasets, especially UK Office for National Statistics (ONS) data.
• Ability to interpret geographic data and translate local insights into regional or national datasets.
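The data-quality responsibilities above (completeness rules, deduplication, anomaly detection) can be sketched in Pandas, one of the libraries the role names. This is a minimal illustrative sketch only: the DataFrame contents, column names (`record_id`, `postcode`, `value`), and the z-score threshold are assumptions for the example, not details taken from the role description.

```python
import pandas as pd

# Hypothetical sample of a multi-source extract; columns are illustrative.
df = pd.DataFrame({
    "record_id": [1, 2, 2, 3, 4],
    "postcode": ["SW1A 1AA", "EC1A 1BB", "EC1A 1BB", None, "M1 1AE"],
    "value": [100.0, 105.0, 105.0, 98.0, 5000.0],
})

# Completeness rule: flag rows missing a required field.
missing = df[df["postcode"].isna()]

# Deduplication: keep the first row per business key.
deduped = df.drop_duplicates(subset=["record_id"], keep="first")

# Anomaly detection: simple z-score rule on a numeric measure
# (threshold chosen for this tiny sample, not a production value).
z = (deduped["value"] - deduped["value"].mean()) / deduped["value"].std()
anomalies = deduped[z.abs() > 1.0]

print(len(missing), len(deduped), len(anomalies))
```

In a real pipeline these rules would typically run against MySQL extracts and feed an exception-handling step, with the thresholds and key columns agreed per dataset.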