

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in London on a fixed-term contract of more than 6 months, with a salary of £40,000-£44,000. Key skills required include hands-on experience with Redshift, Python, and SQL, and familiarity with ETL workflows and data quality testing.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
200
-
🗓️ - Date discovered
September 19, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
Fixed Term
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Datasets #Amazon Redshift #Airflow #Automation #Dimensional Modelling #Schema Design #AWS (Amazon Web Services) #Python #SQL (Structured Query Language) #Data Manipulation #Scala #Data Warehouse #GCP (Google Cloud Platform) #Data Visualisation #Redshift #Cloud #Data Science #Data Pipeline #Looker #Computer Science #GitHub #ETL (Extract, Transform, Load) #Agile #Storytelling #Deployment #Metadata #Azure #Programming #Data Engineering #Scripting #Databases #Data Quality #Tableau
Role description
JOB DETAILS
JOB BAND: C
CONTRACT TYPE: Full-time / Fixed-term
DEPARTMENT: BBC Chief Customer Officer Group
LOCATION: London - London Broadcasting House
PROPOSED SALARY RANGE: £40,000-£44,000
We're happy to discuss flexible working. If you'd like to, please indicate your preference in the application – though there's no obligation to do so now. Flexible working will be part of the discussion at offer stage.
PURPOSE OF THE ROLE
The BBC is reinventing itself for a new generation—delivering world-class creativity, global reach, and public value. The World Service plays a vital role in this mission, supporting international journalism across over 40 languages and markets. Our team is adding to a scalable data warehouse to unify diverse data sources—from audience analytics to editorial metadata—and enable smarter, data-driven decisions across the organisation.
As a Data Engineer, you’ll be part of a multi-disciplinary agile team working alongside analysts, data scientists, and audience researchers. You’ll help design and build data pipelines, contribute to infrastructure, and support the delivery of high-quality, resilient data systems that empower global storytelling.
WHY JOIN THE TEAM
Join a team that supports global journalism in more than 40 languages. You'll work with diverse datasets that reflect the complexity of the real world, while growing your skills in a supportive, international environment. Together, we build infrastructure that empowers journalists and informs millions worldwide.
Your Key Responsibilities And Impact
• Build and maintain scalable ETL pipelines using Python and SQL.
• Work with Amazon Redshift to design and optimise our data warehouse.
• Translate stakeholder requirements into technical solutions.
• Test and validate data workflows to ensure quality and reliability.
• Collaborate across disciplines to create value with data.
• Contribute to a culture of learning, adaptability, and best practices.
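To give a flavour of the first two responsibilities, a minimal ETL step in Python might look like the sketch below. This is illustrative only: it uses the standard-library sqlite3 module as a local stand-in for Amazon Redshift, and the table and field names (audience_plays, id, lang, plays) are invented for the example.

```python
import sqlite3

def transform(rows):
    """Clean raw audience records: drop rows with a missing id
    and normalise language codes to lower case."""
    return [
        (r["id"], r["lang"].strip().lower(), r["plays"])
        for r in rows
        if r.get("id") is not None
    ]

def load(conn, rows):
    """Load cleaned rows into a (hypothetical) warehouse table.
    In production this would target Redshift via a connector
    rather than a local SQLite database."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS audience_plays "
        "(id INTEGER, lang TEXT, plays INTEGER)"
    )
    conn.executemany("INSERT INTO audience_plays VALUES (?, ?, ?)", rows)
    conn.commit()

# Extract step stubbed with in-memory records for illustration.
raw = [
    {"id": 1, "lang": " EN ", "plays": 120},
    {"id": None, "lang": "fr", "plays": 5},   # dropped: missing id
    {"id": 2, "lang": "Hi", "plays": 48},
]
conn = sqlite3.connect(":memory:")
load(conn, transform(raw))
print(conn.execute("SELECT COUNT(*), SUM(plays) FROM audience_plays").fetchone())
# → (2, 168)
```

The same extract–transform–load shape scales up: the extract step would read from source systems, and the load step would write to the Redshift warehouse described above.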
Your Skills And Experience
• Hands-on experience with Redshift, Python, and SQL.
• Educational background in Computer Science, Data Engineering, or a related field.
• Exposure to data warehouse projects, including schema design and data modelling.
• Familiarity with ETL/ELT workflows and data quality testing.
• Strong team collaboration and a proactive learning mindset.
Essential Criteria
• Hands-on Experience with Redshift, Python & SQL
  • Practical experience working with Amazon Redshift for data warehousing tasks.
  • Comfortable using Python for scripting and automation, with solid SQL skills for querying and data manipulation.
• Educational Background in a Relevant Field
  • Holds a degree (or equivalent training) in Computer Science, Data Engineering, Information Systems, or a related discipline.
  • Demonstrates foundational knowledge of data structures, databases, and programming.
• Exposure to Data Warehouse Projects
  • Has contributed to the design, implementation, or maintenance of a data warehouse.
  • Understands concepts such as dimensional modelling, schema design, and data partitioning.
• Experience with ETL Pipelines and Data Quality Testing
  • Familiar with building or maintaining ETL/ELT workflows using tools or custom scripts.
  • Has tested data pipelines for accuracy, completeness, and performance, and understands the importance of data validation.
• Strong Team Collaboration and Learning Mindset
  • Demonstrates eagerness to learn new tools and technologies.
  • Communicates well, takes feedback constructively, and thrives in collaborative environments.
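The data quality testing criterion above often comes down to small, explicit checks over each batch of records. A hedged sketch of that idea follows; the field names (id, lang, plays) are invented for illustration and are not part of any BBC schema.

```python
def check_batch(records, required_fields=("id", "lang", "plays")):
    """Validate a batch for completeness and basic accuracy.
    Returns a list of human-readable issues; an empty list means
    the batch passes."""
    issues = []
    for i, rec in enumerate(records):
        # Completeness: every required field must be present.
        for field in required_fields:
            if rec.get(field) is None:
                issues.append(f"row {i}: missing {field}")
        # Accuracy: play counts cannot be negative.
        plays = rec.get("plays")
        if isinstance(plays, int) and plays < 0:
            issues.append(f"row {i}: negative plays")
    return issues

good = [{"id": 1, "lang": "en", "plays": 10}]
bad = [{"id": 2, "lang": None, "plays": -3}]
print(check_batch(good))  # → []
print(check_batch(bad))   # → ['row 0: missing lang', 'row 0: negative plays']
```

In practice, checks like these run automatically inside the pipeline (e.g. as a validation task after each load) so that bad batches are caught before they reach downstream consumers.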
Desired But Not Required
• Media or broadcast experience.
• Experience with Airflow or other orchestration tools.
• Familiarity with cloud platforms (AWS, GCP, Azure).
• Knowledge of data visualisation tools (e.g., Tableau, Looker).
• Contributions to open-source projects or a personal GitHub portfolio.
If you can bring some of these skills and experience, along with transferable strengths, we’d love to hear from you and encourage you to apply.
Disclaimer
This job description is a written statement of the essential characteristics of the job, with its principal accountabilities, incorporating a note of the skills, knowledge and experience required for a satisfactory level of performance. This is not intended to be a complete, detailed account of all aspects of the duties involved.
Please note: If you were to be offered this role, the BBC will conduct Employment screening checks which include Reference checks; Eligibility to work checks; and if applicable to the role, Safeguarding and Adverse media/Social media checks. Any offer made is conditional on these checks being satisfactory.
For any general queries, please contact: bbchr@bbc.co.uk
Redeployment
The BBC is committed to redeploying employees seeking suitable alternative employment within the BBC and they will be given priority consideration ahead of other applicants. Priority consideration means for those employees seeking redeployment their application will be considered alongside anyone else at risk of redundancy, prior to any individuals being considered who are not at risk.