Data Engineer (Contract)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Contract) with a 12-month duration, remote work in the UK, and occasional travel to Kirkuk, Iraq. Key skills include ETL/ELT processes, cloud platforms (AWS, Azure), and experience in the oil and energy sector. Pay rate is unspecified.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 17, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Sunbury-On-Thames, England, United Kingdom
-
🧠 - Skills detailed
#dbt (data build tool) #BI (Business Intelligence) #Dimensional Modelling #Data Pipeline #Data Engineering #Cloud #Scala #Data Quality #Data Management #"ETL (Extract, Transform, Load)" #REST (Representational State Transfer) #Normalization #Schema Design #Scripting #Azure #Informatica #Data Science #Data Warehouse #Data Lake #GraphQL #Indexing #SQL (Structured Query Language) #Computer Science #GCP (Google Cloud Platform) #NoSQL #Azure Data Factory #Databases #ADF (Azure Data Factory) #JSON (JavaScript Object Notation) #IICS (Informatica Intelligent Cloud Services) #API (Application Programming Interface) #Java #Python #Data Integration #AWS (Amazon Web Services)
Role description
Job Title: Data Engineer
Job Location: Remote (UK), with occasional travel to Kirkuk, Iraq (5-10%)
Contract Length: 12 months (possibility of extension)
Industry: Oil and Energy, IT
Working Hours: 8 per day / 40 per week

Role Overview: The Data Engineer reports to the Digital Team Data Manager and is responsible for building data integration solutions that address business problems. This is an exciting opportunity for a skilled data professional to play a critical role in delivering modern, robust solutions in an ambitious organisation. You will be instrumental in implementing new technologies, designing and maintaining scalable ETL/ELT processes, and developing and administering cloud-based data platforms. This role requires a proactive individual with a passion for data quality, cloud technology, and continuous improvement.

What you will do:
• Design, build, and maintain high-performance data pipelines and APIs.
• Collaborate cross-functionally to understand business needs and align data strategies accordingly.
• Assemble and optimise large, complex data sets from diverse sources.
• Champion data quality and promote a data-as-a-product mindset.
• Define and document technical solutions.
• Apply appropriate rigour and testing to all products delivered.
• Support troubleshooting and continuous improvement across all areas of data management.
• Foster a strong sense of ownership, urgency, and teamwork within the Digital Team.

What you will have:
• Bachelor's degree in Data Science, Information Management, Computer Science, or a related discipline.
• Experience in high-risk industries such as oil & gas, energy, or critical infrastructure (a plus).
• A meticulous approach, with attention to detail, consistency, and reliability.
• Ability to design efficient, scalable data models for a range of analytical and operational use cases, including an understanding of normalisation and denormalisation.
• Strong ability to analyse complex data challenges, identify root causes, and propose effective solutions.
• Proficiency in designing, building, and optimising ETL or ELT pipelines from databases.
• Experience with data pipeline orchestration tools such as Informatica IICS, Azure Data Factory, or dbt Cloud.
• Experience integrating data from internal and external APIs (REST, SOAP, GraphQL).
• Strong knowledge of data warehouse design principles and dimensional modelling, with experience of data warehousing solutions.
• Understanding and experience of Azure or AWS data lake concepts.
• Deep understanding of SQL databases, including schema design, indexing, and performance tuning.
• Familiarity with NoSQL databases for handling unstructured or semi-structured data.
• Strong experience with SQL, Python, Java, JSON, and Parquet.
• A curious and creative approach to data challenges.
• Strong team player with a demonstrated ability to build consensus, ensuring excellent stakeholder engagement, alignment, and ethical decision-making.
• Excellent verbal and written communication skills for collaborating with data scientists, analysts, business stakeholders, and other engineers.
• Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.
• Fluency in English, both written and spoken (essential).
• Proficiency in Arabic and/or Kurdish (desirable).

Skills:
• Critical thinking
• Enterprise data management
• Business intelligence and analytics
• Cloud data platforms (AWS, Azure, GCP)
• Data modelling
• Scripting and querying (Python, SQL)
• Pipeline development
• API integration
• Strong stakeholder and communication skills
• Flexible approach

We are an equal opportunity employer and value diversity at our company.
We do not discriminate on the basis of race, religion, colour, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.