

Xcede
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a Data Engineer role on a 6-month contract, hybrid with remote work and occasional EU travel. Key skills include SQL, Python, cloud expertise (AWS, Azure, GCP), and experience with data pipelines and modern data warehouses.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 2, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#NoSQL #SAP #Monitoring #Business Objects #Scala #Data Warehouse #DevOps #Python #Redshift #SQL (Structured Query Language) #Storage #Docker #Data Architecture #Data Engineering #AWS (Amazon Web Services) #Oracle #Airflow #Data Ingestion #Azure #Automation #dbt (data build tool) #Spark (Apache Spark) #BigQuery #Cloud #Kafka (Apache Kafka) #Java #Security #BI (Business Intelligence) #BO (Business Objects) #Data Quality #Agile #Migration #GCP (Google Cloud Platform) #Microsoft Power BI #Snowflake #ETL (Extract, Transform, Load) #Code Reviews #Kubernetes #Infrastructure as Code (IaC)
Role description
Data Engineer
6-month contract
Hybrid
Remote with some ad-hoc travel within the EU
We are seeking a Data Engineer to support our Core Banking Unit within the Engineering Core Platform Department. This role will focus on designing, building, and optimising data solutions that drive regulatory, operational, and analytical capabilities across multiple banking areas.
As a contractor, you will play a key part in modernising legacy systems, enabling migration to a unified data platform, and ensuring high standards of quality, security, and performance. You will work within a cross-functional Agile team in an international environment, collaborating with analysts, engineers, and stakeholders to deliver impactful data solutions.
Key Responsibilities
• Design and implement systems for data ingestion, processing, storage, and sharing
• Build scalable, reliable, high-performance data architectures
• Develop and maintain ETL/ELT pipelines for regulatory, operational, and analytical needs
• Modernise legacy systems and support platform migration initiatives
• Maintain high data quality, security, availability, and performance standards
• Perform code reviews, troubleshoot, and resolve defects
• Implement monitoring and alerting for data workflows
• Collaborate across teams to deliver business-driven solutions
Required Skills & Experience
• Strong hands-on experience with SQL and Python (Java is a plus)
• Cloud expertise (AWS, Azure, or GCP)
• Container orchestration skills (Kubernetes, Docker)
• Knowledge of streaming platforms (Kafka, Spark, Flink)
• Familiarity with pipeline/orchestration tools (Airflow, dbt)
• Experience with modern data warehouses (Snowflake, BigQuery, Redshift)
• Working knowledge of enterprise platforms (Oracle DB, SAP IQ, SAP Data Services, SAP Business Objects) and Power BI
• Experience with integration patterns (real-time, event-driven, bridging legacy to modern platforms)
• SQL and NoSQL database expertise
• Exposure to DevOps practices (infrastructure as code, test automation, security best practices)
• Fluent English (spoken and written)