

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a 12-month contract in London (hybrid). Key skills include GCP, BigQuery, Python/Java, SQL, and ETL pipeline development. Financial services experience and knowledge of Agile frameworks are preferred. Competitive day rate offered.
Country: United Kingdom
Currency: £ GBP
Day rate: -
Date discovered: June 11, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: London Area, United Kingdom
Skills detailed
#Security #Data Integration #Infrastructure as Code (IaC) #Scala #Programming #ETL (Extract, Transform, Load) #Automation #Data Quality #Python #Cloud #Data Processing #Agile #Data Pipeline #GCP (Google Cloud Platform) #Terraform #SQL (Structured Query Language) #Data Engineering #Compliance #Dataflow #Jira #BigQuery #Storage #Big Data #Data Governance #Java
Role description
Data Engineering Manager (x2)
London - Hybrid (2 days/week, usually Tues & Fri)
12-month contract
Competitive day rate
ASAP start
About the Role
Join a critical data remediation programme within the Surveillance Team/Lab, part of the Markets Platform area of a major financial services business.
You will lead design and delivery of stable, scalable, and performant data solutions within a complex GCP architecture, driving simplification and improving pipeline automation to enable timely, secure data delivery to surveillance systems.
Key Responsibilities
• Engineer stable, scalable, performant, accessible, testable, and secure data products aligned with endorsed technologies and best practices
• Lead the design, build, and optimisation of GCP-based data pipelines for critical surveillance and remediation projects (a minimal pipeline sketch follows this list)
• Apply common build patterns to minimise technical debt and adhere to group policies and frameworks for build and release
• Participate actively in design and implementation reviews, Agile ceremonies, and planning to set clear goals and priorities
• Identify opportunities to automate repetitive tasks and promote reuse of components
• Engage with technical communities to share knowledge and advance shared capabilities
• Lead incident root-cause analysis and promote active application custodianship to drive continuous improvement
• Invest in the ongoing development of your technical and Agile skills
• Collaborate closely with architecture, security, and compliance teams to ensure solutions meet regulatory requirements
• Manage and enhance the automated CI/CD pipelines that support data engineering workflows
• Support potential transition planning to vendor surveillance products
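As a rough illustration of the kind of GCP pipeline this role describes, here is a minimal Apache Beam sketch in Python that reads JSON events from Pub/Sub and appends valid records to BigQuery. This is an assumed example, not the client's actual pipeline: the project, topic, table, and field names are all hypothetical, and a real Dataflow deployment would pass runner, project, and region flags on the command line.

```python
# Minimal streaming sketch: Pub/Sub -> validate -> BigQuery (names hypothetical).
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True because Pub/Sub is an unbounded source; Dataflow runner
    # flags (--runner, --project, --region) would normally come from the CLI.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/trade-events")  # hypothetical topic
            | "ParseJson" >> beam.Map(json.loads)
            | "KeepValid" >> beam.Filter(lambda rec: "trade_id" in rec)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:surveillance.trades",  # hypothetical table, assumed to exist
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```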
Core Skills & Experience
• Proven hands-on experience with Google Cloud Platform (GCP) and components such as Dataflow, Pub/Sub, and Cloud Storage
• Deep expertise in BigQuery and Big Data processing
• Strong programming skills in Python or Java, applied extensively to surveillance data integration
• Solid SQL expertise for querying, transforming, and troubleshooting data (see the query sketch after this list)
• Experience building robust ETL pipelines from diverse source systems
• Familiarity with CI/CD pipeline automation and enhancement
• Understanding of Agile delivery frameworks, including Jira and sprint planning
• Knowledge of Terraform for infrastructure-as-code provisioning
• Experience with containerisation technologies to deploy and manage environments
• Ability to simplify complex architectures and communicate effectively across teams
• Strong stakeholder management and influencing skills
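To illustrate the BigQuery and SQL side of the skill set, here is a minimal sketch using the google-cloud-bigquery Python client with a parameterised query. Again, this is an assumed example: the project, dataset, table, and column names are hypothetical.

```python
# Minimal sketch: parameterised BigQuery query via the official Python client.
import datetime

from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

# Count surveillance alerts per desk for a given day (schema hypothetical).
query = """
    SELECT desk, COUNT(*) AS alert_count
    FROM `my-project.surveillance.alerts`
    WHERE DATE(event_ts) = @run_date
    GROUP BY desk
    ORDER BY alert_count DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", datetime.date(2025, 6, 11))
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.desk, row.alert_count)
```

Query parameters are used here rather than string interpolation so that inputs are typed and safely escaped, which matters in a compliance-sensitive surveillance context.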
Nice to Have
• Experience in financial services or surveillance system data engineering
• Understanding of data governance and data quality best practices
• Exposure to vendor-based surveillance solutions
Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.