

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month hybrid contract in London, UK, offering competitive pay. Key skills include data engineering, ETL processes, SQL, Python, and experience in cloud environments, particularly AWS and Snowflake.
Country
United Kingdom
Currency
£ GBP
Day rate
-
Date discovered
June 17, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Inside IR35
Security clearance
Unknown
Location detailed
London Area, United Kingdom
Skills detailed
#Data Lineage #Data Management #Cloud #GraphQL #Version Control #dbt (data build tool) #AI (Artificial Intelligence) #Data Architecture #Snowflake #Data Governance #SQL (Structured Query Language) #Scala #Python #AWS (Amazon Web Services) #Data Quality #Metadata #Monitoring #Agile #ETL (Extract, Transform, Load) #DataOps #Data Engineering #Data Pipeline #Data Warehouse #Data Manipulation #ML (Machine Learning) #DevOps #Code Reviews #Teradata #Airflow
Role description
Job title: Senior Data Engineer
Will the role be 100% remote, hybrid or 100% office? Hybrid, 3 days/week
If the role is hybrid/office-based, specify location: London, UK
Duration of assignment: 6-month contract, Inside IR35
Role description:
Join a high-impact data transformation programme within the aviation sector, where a leading global airline is undergoing a major data modernization journey. The initiative goes far beyond a basic lift and shift; it's a forward-looking transformation that blends the stability of mature legacy systems with the innovation of cloud-first, AI-driven architecture.
As a Senior Data Engineer, you'll play a critical role in building and optimizing modern, scalable data solutions that enable smarter decision-making, richer customer experiences and operational excellence. You'll be part of a highly collaborative network of teams working with cutting-edge cloud-based technologies while also navigating complex legacy on-premises environments.
This role offers engineers the opportunity to leave behind traditional approaches and contribute to a programme with long-term impact at the forefront of aviation data innovation.
Key responsibilities:
Design, build and maintain robust data pipelines that support critical business applications and analytics
Analyze, re-engineer and modernize existing ETL processes from legacy systems into scalable cloud-native solutions
Contribute to the development and optimization of a cloud-based data platform, leveraging tools like Snowflake, AWS and Airflow (a minimal pipeline sketch follows this list)
Work closely with data architects, analysts and other engineers to deliver high-quality, production-ready code
Participate in code reviews, ensuring adherence to best practices and high engineering standards
Investigate data quality issues and implement monitoring and alerting systems to ensure pipeline reliability
Document workflows, data lineage and technical designs to support maintainability and knowledge sharing
Champion a culture of continuous improvement, experimentation and technical excellence within the team
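To make the pipeline work above concrete, here is a minimal sketch of the kind of Airflow DAG referenced in the responsibilities, assuming Airflow 2.4+ with a legacy extract feeding a Snowflake load. The DAG name, task names and business domain are hypothetical placeholders, not taken from the role description.
```python
# Minimal illustrative DAG: a daily extract from a legacy source followed by a
# load into Snowflake. All names here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_bookings(**context):
    """Pull the previous day's records from the legacy (e.g. Teradata) source."""
    ...  # placeholder for the actual extract logic


def load_to_snowflake(**context):
    """Load the extracted records into a Snowflake staging table."""
    ...  # placeholder for a COPY INTO / connector-based load


with DAG(
    dag_id="daily_bookings_pipeline",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_bookings", python_callable=extract_bookings)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load
```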
Key skills/knowledge/experience:
Strong hands-on experience with data engineering in both on-prem and cloud-based environments
Good working knowledge and hands-on experience with Teradata and Informatica.
Proficiency in working with legacy systems and traditional ETL workflows
Solid experience building data pipelines using modern tools (Airflow, dbt, Glue, etc.) and working with large volumes of structured and semi-structured data
Demonstrated experience with SQL and Python for data manipulation, pipeline development and workflow orchestration (see the SQL/Python sketch after this list)
Strong grasp of data modelling, data warehousing concepts and performance optimization techniques
Hands-on exposure to cloud platforms, especially AWS
Experience working in agile teams and using version control and CI/CD practices
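As a rough illustration of the SQL and Python work listed above, the sketch below pulls rows from a warehouse table and aggregates them with pandas. It assumes the snowflake-sqlalchemy dialect is installed; the connection string, table and column names are hypothetical.
```python
# Illustrative SQL-plus-Python transformation step; names are placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

# In practice the connection string would come from configuration or a
# secrets manager, never hard-coded.
engine = create_engine("snowflake://user:password@account/db/schema")

QUERY = text(
    """
    SELECT flight_id, departure_ts, status
    FROM raw.flight_events
    WHERE departure_ts >= :since
    """
)


def build_daily_summary(since: str) -> pd.DataFrame:
    """Pull raw events since a given date and aggregate them per day and status."""
    with engine.connect() as conn:
        events = pd.read_sql(QUERY, conn, params={"since": since})
    events["departure_date"] = pd.to_datetime(events["departure_ts"]).dt.date
    return (
        events.groupby(["departure_date", "status"])
        .size()
        .reset_index(name="flight_count")
    )
```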
Desirable skills/knowledge/experience:
Experience with Snowflake or other cloud-native data warehouse technologies
Familiarity with GraphQL and its use in data-driven APIs
Exposure to data governance, data quality and metadata management tools (a simple data-quality check is sketched at the end of this section)
Interest or experience in applying machine learning / AI pipelines or features within a data engineering context
Understanding of DevOps concepts as applied to data (DataOps) and infrastructure-as-code tools like Terraform or CloudFormation
Previous experience in highly regulated industries or large-scale, enterprise-grade environments
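For the data-quality point above, a lightweight check that a pipeline could run before publishing a batch might look like the sketch below. It is plain Python/pandas; the column names and threshold are hypothetical.
```python
# Illustrative completeness check; column names and thresholds are placeholders.
import pandas as pd


def check_completeness(df: pd.DataFrame, key_columns: list[str],
                       max_null_ratio: float = 0.01) -> list[str]:
    """Return a list of issues found; an empty list means the batch passes."""
    issues: list[str] = []
    if df.empty:
        issues.append("batch is empty")
        return issues
    for col in key_columns:
        null_ratio = df[col].isna().mean()
        if null_ratio > max_null_ratio:
            issues.append(f"{col}: {null_ratio:.1%} nulls exceeds {max_null_ratio:.1%} threshold")
    return issues


if __name__ == "__main__":
    sample = pd.DataFrame({"flight_id": [1, 2, None], "status": ["on_time", None, "delayed"]})
    print(check_completeness(sample, ["flight_id", "status"]))
```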