

Channel 4
Data Engineer – Quality Assurance FTC
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer – Quality Assurance on a fixed-term contract until December 2026, based in Leeds/Manchester. Key skills include SQL, UAT, and data engineering. Experience in cloud platforms and automated testing is essential. Pay rate is "Unknown."
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
England, United Kingdom
-
🧠 - Skills detailed
#Leadership #API (Application Programming Interface) #ETL (Extract, Transform, Load) #Security #Data Quality #Automated Testing #Documentation #AWS (Amazon Web Services) #Datasets #Azure #Migration #Version Control #Data Engineering #Cloud #Spark (Apache Spark) #Data Warehouse #SQL (Structured Query Language) #BI (Business Intelligence) #Quality Assurance #Regression #Strategy #Semantic Models #UAT (User Acceptance Testing) #Python #Data Pipeline #Anomaly Detection #Microsoft Power BI #dbt (data build tool) #Monitoring #Data Governance #Scala
Role description
Job Title: Data Engineer – Quality Assurance (Fixed Term Contract to December 2026)
Reports to: Data Enablement Manager
Department: Strategy, Policy & Insights
Location: Leeds / Manchester
Job Grade: P1
DEPARTMENT DESCRIPTION
Audience Insight are trusted thought-leaders bringing expert strategy, policy, market & audience insight – challenging and advising the business to drive delivery of Channel 4’s strategy and remit.
The purpose of the new centralised Audience Insights (Subfunction) is to be independent thought leaders at the heart of Channel 4’s creative and commercial decision-making, bringing a single source of truth and big picture perspective, challenging and advising the business to ensure profitable business growth through the use of data.
Within the new centralised operating Insights model (where all audience and market insight in C4 is centralised into the Audience Insight team), each C-Suite Leader (CEO, Chief Content Officer, Chief Operating Officer, Chief Revenue Officer, Chief Marketing Officer and the Exec Committee as a body) and their Senior Leaders have a dedicated Insight Business Partner (BP) who prioritises and agrees their Insight requirements, aligned to the strategic C4 pillars. BP Analysts directly source available analysis, and where additional bespoke research or specialised analysis is required, the BP commissions this from the relevant Specialist Lead, jointly managing the presentation of recommendations back to the C-Suite Leader.
JOB PURPOSE
Working within the Reporting & Data Enablement team, the Data Engineer – Quality Assurance ensures the reliability, accuracy and integrity of the data pipelines, curated datasets and system integrations that underpin Channel 4’s reporting, analytics and downstream systems.
The role provides independent validation of data engineering and transformation work during a period of significant platform and systems change. The post holder designs and executes robust quality assurance processes across ingestion pipelines, transformation logic, system integrations and semantic models to ensure continuity of reporting, analytics, operational and systems processes across dual running and migration.
The role validates data across multiple internal and partner systems, ensuring consistency, completeness and accuracy between legacy and new platforms. Working closely with data engineers, report developers and Technology teams, the role embeds structured testing approaches, automated validation and reconciliation frameworks to ensure high confidence in production-ready data products.
The post holder aligns to engineering and governance standards (testing frameworks, documentation, version control, CI/CD) and supports the safe delivery of scalable, reliable and trusted data products across the organisation.
The role acts as an independent quality gate with authority to approve or block pull requests, pipeline promotions and releases where acceptance criteria are not met.
KEY RESPONSIBILITIES
• Design and execute QA processes across ingestion pipelines, transformation logic and curated datasets to ensure reliable, model-ready and governed data for reporting and analytics.
• Establish and embed QA standards, ways of working and expectations within BI, acting as the first point of reference for data quality assurance.
• Implement structured data testing for the UAT phase of client-side data collection.
• Work closely with project stakeholders to identify and document functional UAT test scenarios.
• Perform functional UAT testing for client-side data collection apps and present results in a timely manner.
• Define, document and maintain data acceptance criteria for Streaming Transformation Programme (STP) and BI deliverables, including validation of business rules through end-to-end data flows, metric logic and KPI calculations.
• Validate data consistency between legacy and new systems during parallel operation, ensuring continuity and confidence in reporting outputs.
• Support the development of, and ideally automate, reconciliation and validation checks that compare datasets across systems, pipelines and reporting layers.
• Implement structured data testing approaches (schema validation, record reconciliation, anomaly detection, transformation validation and regression testing).
• Ensure production data continues to conform to defined quality and business standards through post-release validation, reconciliation and monitoring.
• Work closely with data engineers and semantic-layer developers to ensure transformation logic produces consistent, reliable outputs aligned to existing metric and KPI definitions.
• Design and maintain test frameworks covering pipeline logic, data models, integrations and reporting outputs.
• Support Dev → Test → Prod workflows by validating releases, reviewing test coverage and test cycle evidence and confirming that pipeline and model changes behave as expected in production.
• Validate system integrations and data exchanges between platforms, ensuring data contracts, schemas and transformation logic remain consistent.
• Monitor pipelines and reconciliation outputs to proactively identify issues before they impact reporting.
• Document validation processes, reconciliation logic, data quality rules and test coverage to support governance and auditability.
• Collaborate with Technology Data Engineering teams to ensure integration points, pipelines and infrastructure changes are appropriately documented and validated before release.
• Support STP by providing QA across system migrations, platform upgrades and new capability implementations.
• Produce QA artefacts (evidence packs, sign-off summaries, risk statements) for STP governance.
• Identify potential risks to data continuity or reporting integrity and escalate proactively with recommended mitigations.
• Share testing approaches, document QA frameworks and support knowledge sharing across engineering and reporting teams.
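To make the reconciliation responsibilities above concrete, here is a minimal, purely illustrative sketch of a record-level reconciliation check between a legacy and a new pipeline output. None of this is Channel 4 code; every name, key and tolerance is a hypothetical assumption.

```python
# Illustrative record-reconciliation check between two pipeline outputs,
# keyed on a shared business key. All identifiers are hypothetical.
from dataclasses import dataclass, field


@dataclass
class ReconciliationResult:
    row_count_match: bool
    missing_in_new: list = field(default_factory=list)  # keys absent from the new extract
    mismatched: list = field(default_factory=list)      # keys whose values disagree


def reconcile(legacy: dict, new: dict, tolerance: float = 0.0) -> ReconciliationResult:
    """Compare two {business_key: metric_value} extracts."""
    missing = [k for k in legacy if k not in new]
    mismatched = [
        k for k in legacy
        if k in new and abs(legacy[k] - new[k]) > tolerance
    ]
    return ReconciliationResult(
        row_count_match=len(legacy) == len(new),
        missing_in_new=missing,
        mismatched=mismatched,
    )


legacy = {"prog_001": 1200.0, "prog_002": 530.0, "prog_003": 99.5}
new = {"prog_001": 1200.0, "prog_002": 530.0, "prog_003": 100.0}

result = reconcile(legacy, new, tolerance=0.1)
print(result.row_count_match)  # True: both extracts have three rows
print(result.mismatched)       # ['prog_003']: difference of 0.5 exceeds the tolerance
```

In practice such checks would run against warehouse query results rather than in-memory dicts, but the pattern (key alignment, tolerance-aware comparison, explicit result object for evidence packs) carries over.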
Key Relationships & Stakeholders
• Reporting & Data Enablement leadership and Reporting Manager
• Data Team Project Manager and Lead Developer
• Data Engineers and Semantic-layer Developers
• Architecture, Transformation and Business Viewer Readiness Teams (platform readiness, access, pipeline orchestration)
• Programme and delivery teams responsible for systems implementation
• Data Governance / Security Teams (standards, controls, assurance)
ESSENTIAL SKILLS AND EXPERIENCE
• Proven experience in data engineering, web analytics engineering and/or data QA roles delivering reliable and governed datasets for BI and analytics.
• Hands-on experience in functional UAT testing of complex web-based and mobile client-server applications.
• Demonstrable hands-on experience in complex functional systems testing (e.g. API testing).
• Strong SQL and practical experience validating data across cloud data platforms (e.g., AWS / Azure / Fabric / Adobe / mParticle or equivalent).
• Hands-on experience designing data quality, reconciliation and validation frameworks across pipelines and datasets.
• Experience in validating system integrations and data flows between platforms, including schema validation and transformation verification.
• Practical experience working on large-scale system implementations, platform migrations or transformation programmes.
• Experience supporting parallel system operation or migration environments where data continuity and reconciliation are critical.
• Strong understanding of ELT/ETL patterns, incremental loads, transformation logic and data quality testing.
• Experience with governance standards (testing frameworks, documentation, version control, CI/CD) and delivery of scalable, reliable and trusted data products.
• Experience in implementing automated testing and monitoring approaches within engineering pipelines.
• Evidence of documentation, version control and release/change discipline.
• Strong communication and collaboration with Data and Operational teams; ability to translate technical findings into actionable insights for stakeholders.
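The schema-validation skill listed above can be illustrated with a short sketch: checking that an extracted batch still matches an agreed data contract (column names and types). The contract, column names and sample rows below are invented for illustration only.

```python
# Hypothetical schema-validation check against a simple data contract.
# All column names and the contract itself are illustrative assumptions.

EXPECTED_SCHEMA = {"viewer_id": str, "minutes_viewed": float, "broadcast_date": str}


def validate_schema(rows: list, expected: dict) -> list:
    """Return human-readable violations; an empty list means the batch passes."""
    violations = []
    for i, row in enumerate(rows):
        missing = set(expected) - set(row)
        if missing:
            violations.append(f"row {i}: missing columns {sorted(missing)}")
        for col, typ in expected.items():
            if col in row and not isinstance(row[col], typ):
                violations.append(
                    f"row {i}: {col} has type {type(row[col]).__name__}, "
                    f"expected {typ.__name__}"
                )
    return violations


batch = [
    {"viewer_id": "v1", "minutes_viewed": 42.0, "broadcast_date": "2026-01-01"},
    {"viewer_id": "v2", "minutes_viewed": "13", "broadcast_date": "2026-01-01"},
]
print(validate_schema(batch, EXPECTED_SCHEMA))
# row 1 is flagged: minutes_viewed arrived as a string rather than a float
```

Real pipelines would typically express the contract in the platform's own tooling (warehouse column metadata, dbt schema tests or similar) rather than hand-rolled Python, but the shape of the check is the same.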
DESIRABLE
• Working knowledge of Power BI data model requirements and how upstream pipeline design affects reporting outputs.
• Experience with Python/Spark or equivalent tools for building automated validation and reconciliation checks.
• Familiarity with testing frameworks used in modern data engineering environments (e.g., dbt tests or similar approaches).
• Experience working within fast-paced technology or IT transformation programmes.
• Experience working with system integrations, APIs or data exchange frameworks beyond purely data warehouse pipelines.
• Media, product or streaming industry context helpful but not essential.
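The automated monitoring mentioned in the desirable skills might, at its simplest, flag a daily load whose row count deviates sharply from recent history. The sketch below uses a basic z-score test; the history, counts and threshold are all made-up illustrations, not values from the role.

```python
# Illustrative row-count anomaly check for pipeline monitoring.
# The 3-sigma threshold and sample figures are assumptions for the sketch.
from statistics import mean, stdev


def is_anomalous(history: list, today: int, z_threshold: float = 3.0) -> bool:
    """Flag today's row count if it lies more than z_threshold
    sample standard deviations from the historical mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold


history = [10_000, 10_200, 9_900, 10_100, 10_050]
print(is_anomalous(history, 10_080))  # False: within the normal range
print(is_anomalous(history, 2_500))   # True: a sharp drop worth investigating
```

A production version would usually sit inside the orchestration layer and raise an alert rather than print, and more robust detectors (seasonality-aware baselines, dbt source freshness checks) are common, but this conveys the idea.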
Apply via our careers site to view the full job description and submit your application.