

Data Engineer - Maternity Cover
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Maternity Cover) on an 18-month fixed-term contract; the pay rate is not stated. It requires strong ETL/ELT skills and experience with Azure Data Factory, SQL, and Power BI. Occasional travel to London or Portsmouth is required.
Country: United Kingdom
Currency: £ GBP
Day rate: Unknown
Date discovered: June 6, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: United Kingdom
Skills detailed
#GDPR (General Data Protection Regulation) #ADF (Azure Data Factory) #Data Security #SQL (Structured Query Language) #Security #Data Quality #SSIS (SQL Server Integration Services) #Database Management #Data Integration #Data Cleansing #Data Warehouse #Scala #Computer Science #ETL (Extract, Transform, Load) #Data Engineering #Logic Apps #Azure #Data Storage #Power Automate #Synapse #Data Lake #Data Architecture #Microsoft Power BI #Databases #Data Science #Storage #BI (Business Intelligence) #Data Management #Compliance #Azure Data Factory #Documentation #Data Exploration #DevOps #Dataflow #Data Pipeline #Data Access #Datasets #Version Control
Role description
We're hiring a Data Engineer (Maternity Cover) to join our team. This is an 18-month fixed-term contract, and you will work closely with the IT data team and other stakeholders across the business to ensure seamless data accessibility and usage across the organisation.
This role requires occasional travel to London or Portsmouth, typically once per month.
The Data Engineer is responsible for data integration and cleansing across a wide range of diverse sources, data exploration and visualisation, statistical analysis, data management, and version control, as well as building scalable data architectures and ensuring the reliability of data flows and management information.
This role covers all data consumed internally across Bell. You will design and implement data flows to connect production and analytical systems, create solution and data-flow diagrams and supporting documentation for governance, maintenance, and usage by the organisation, and ensure adherence to change and release management processes.
Responsibilities
As a Data Engineer, you will work closely with the IT data team and other stakeholders across the business to ensure seamless data accessibility and usage across the organisation.
• Design, develop, and maintain scalable, robust data pipelines for ingesting, transforming, and loading large datasets from multiple data sources (e.g., databases, APIs, and third-party services).
• Build and optimize data platforms to support the analytical needs of the business, ensuring high availability and reliability of data systems.
• Collaborate with IT and system users to understand business requirements and ensure data accessibility and integrity.
• Manage and maintain ETL/ELT processes for structured and unstructured data, ensuring data quality and accuracy (a brief illustrative sketch follows this list).
• Develop and implement data storage solutions, including data lakes, data warehouses, and other relevant storage systems, to ensure efficient data access and retrieval.
• Ensure data security and compliance with industry standards and regulations, such as GDPR.
• Monitor data pipelines and flows to detect and resolve data-related issues and optimize performance.
• Document data processes and workflows for both technical and non-technical audiences.
• Continuously improve and refine data engineering practices by adopting new tools, techniques, and best practices.
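For illustration only (not part of the role requirements), the sketch below shows the kind of minimal ingest, cleanse, and load step described above, written in plain Python with only the standard library. The sample records, field names, and SQLite target are hypothetical stand-ins for the real sources and Azure storage referenced in this posting.

```python
# Minimal, self-contained sketch of an extract -> transform -> load step.
# All names (records, fields, table) are hypothetical; a real pipeline in this
# role would more likely be built in Azure Data Factory or Synapse.
import sqlite3

# "Extract": stand-in for rows pulled from a database, API, or third-party feed.
raw_records = [
    {"id": 1, "customer": " Alice ", "amount": "120.50"},
    {"id": 2, "customer": "Bob", "amount": None},   # missing value
    {"id": 2, "customer": "Bob", "amount": None},   # duplicate row
    {"id": 3, "customer": "Carol", "amount": "87.00"},
]

def transform(records):
    """Basic cleansing: trim text, coerce types, default missing amounts, dedupe on id."""
    seen, cleaned = set(), []
    for rec in records:
        if rec["id"] in seen:
            continue
        seen.add(rec["id"])
        cleaned.append({
            "id": rec["id"],
            "customer": rec["customer"].strip(),
            "amount": float(rec["amount"]) if rec["amount"] is not None else 0.0,
        })
    return cleaned

def load(rows, db_path="analytics.db"):
    """Load cleansed rows into a SQLite table (stand-in for a warehouse/lake table)."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO sales (id, customer, amount) VALUES (:id, :customer, :amount)",
            rows,
        )

if __name__ == "__main__":
    load(transform(raw_records))
```

In an Azure setting the same pattern would typically map onto an Azure Data Factory or Synapse pipeline, with a copy activity for the extract, a data flow or stored procedure for the cleansing, and a sink dataset for the load.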
Qualifications
• Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, Engineering, or a related field.
• Experience in data engineering, database management, or software engineering.
• Strong experience with ETL/ELT tools such as Synapse Analytics, Azure Data Factory, or SSIS.
• Proficiency with data insights and visualisation using Power BI.
• Experience with SQL and database management.
• Experience working with APIs and data integration across platforms.
• Knowledge of data modelling techniques and best practices.
• Familiarity with DevOps.
• Knowledge and experience of Power Automate or Logic Apps is desirable.
• Desirable certifications: DP-203 Azure Data Engineer, AZ-900 Azure Fundamentals, and DP-900 Azure Data Fundamentals.
• Strong problem-solving skills with attention to detail and the ability to troubleshoot complex data issues (a brief data-quality check sketch follows this list).
• Excellent communication and collaboration skills, able to work across teams to deliver solutions that meet business needs.
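Again purely as an illustration of the SQL-driven troubleshooting and data-quality work listed above, the sketch below runs a few hypothetical validation queries against the table created in the earlier example; in practice the equivalent checks would more likely target an Azure SQL or Synapse table.

```python
# Hypothetical data-quality checks against the "sales" table from the earlier
# sketch; table and column names are illustrative only.
import sqlite3

CHECKS = {
    "null_amounts": "SELECT COUNT(*) FROM sales WHERE amount IS NULL",
    "negative_amounts": "SELECT COUNT(*) FROM sales WHERE amount < 0",
    "duplicate_ids": """
        SELECT COUNT(*) FROM (
            SELECT id FROM sales GROUP BY id HAVING COUNT(*) > 1
        )
    """,
}

def run_checks(db_path="analytics.db"):
    """Run each check and report any that return a non-zero offending row count."""
    with sqlite3.connect(db_path) as conn:
        for name, query in CHECKS.items():
            (count,) = conn.execute(query).fetchone()
            status = "OK" if count == 0 else f"FAILED ({count} rows)"
            print(f"{name}: {status}")

if __name__ == "__main__":
    run_checks()
```

Checks like these can be scheduled as part of a pipeline run so that data issues surface before they reach downstream Power BI reports.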
Protecting your privacy and the security of your data is a longstanding top priority for Bell Integration. Please consult our Privacy Notice to learn more about how we collect, use and transfer the personal data of our candidates.