Synergize Consulting

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with SC Clearance, offering a 6-month contract at up to £465 p/d on a hybrid remote basis. It requires AWS expertise, Python programming, strong SQL skills, and experience with data lakes and ETL processes.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
465
🗓️ - Date
April 21, 2026
🕒 - Duration
6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Yes
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Business Analysis #Cloud #Redshift #Data Pipeline #Agile #Python #Data Storage #Programming #Informatica #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #Data Warehouse #Databases #DevOps #Data Engineering #Leadership #AWS (Amazon Web Services) #Storage #Migration #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Scala #Data Lake #Oracle
Role description
AWS Data Engineer (Cloud Transformation)
6-month contract
Hybrid remote working (occasional on-site workshops in either Telford or Sussex)
SC Clearance required
Up to £465 p/d

This is an opportunity to join a high-impact data engineering team delivering large-scale cloud transformation projects that drive real business outcomes. This role sits within a dynamic data and analytics delivery function focused on building modern, scalable solutions to support revenue growth, fraud reduction, and cloud migration initiatives.

You'll play a key role in migrating data from legacy on-premises platforms (including Oracle and Informatica) into a modern, cloud-native AWS architecture. Working as part of an Agile delivery team, you'll collaborate with engineers, architects, business analysts, and project managers to design and deliver robust, production-grade data solutions.

Key skills
• Proven experience as a Data Engineer with core AWS services (Glue, Lambda, S3, and Redshift)
• Solid programming skills in Python
• Strong SQL and understanding of data storage technologies (data warehouses, relational databases)
• Experience working with AWS-based data lakes (S3) handling structured and unstructured data
• Knowledge of open table formats (Iceberg or Delta)

Key responsibilities
• Contributing to cloud transformation initiatives, supporting technical leadership and mentoring junior engineers
• Designing, developing, and testing scalable data pipelines for ingestion, processing, and transformation
• Building and optimising ETL/ELT workflows to move data into data lakes, warehouses, and lakehouse environments
• Leveraging AWS and open-source technologies to deliver efficient, reliable data platforms
• Embedding DevOps practices, including CI/CD, into data engineering workflows
• Collaborating with stakeholders to ensure solutions meet business and technical requirements
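For illustration only, and not taken from the hiring organisation's materials: the sketch below shows the general shape of the S3-based ETL pipelines this role describes, reading raw data from an S3 landing zone with PySpark and writing curated, partitioned Parquet back to a data-lake bucket. All bucket names, paths, and column names are hypothetical, and the snippet assumes a Spark runtime with S3 access already configured (for example AWS Glue or EMR).

```python
# Minimal, illustrative PySpark ETL sketch. Bucket names, paths, and column
# names are hypothetical; s3:// reads assume a Glue/EMR-style runtime with
# S3 connectivity configured.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-s3-etl").getOrCreate()

# Extract: read raw CSV files landed in a hypothetical S3 "raw" zone.
raw = spark.read.csv(
    "s3://example-raw-zone/transactions/",
    header=True,
    inferSchema=True,
)

# Transform: light cleansing plus a derived partition column, as a typical step.
cleaned = (
    raw.dropDuplicates(["transaction_id"])                # assumed key column
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("ingest_date", F.current_date())
)

# Load: write partitioned Parquet into a hypothetical curated data-lake zone.
(cleaned.write
        .mode("overwrite")
        .partitionBy("ingest_date")
        .parquet("s3://example-curated-zone/transactions/"))
```

In practice, a pipeline like this would often write to an open table format such as Iceberg or Delta (as named in the key skills) rather than plain Parquet, which adds schema evolution and transactional guarantees on top of the same S3 storage.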