Harnham

AWS Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Engineer on a 6-month contract, paying £500-£550 per day, fully remote. Key skills include AWS services, Python, SQL, dbt, and Airflow. Experience with data ingestion and third-party API integration is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
550
🗓️ - Date
February 12, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#AWS Lambda #Python #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #R #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Observability #Cloud #API (Application Programming Interface) #Agile #dbt (data build tool) #Data Engineering #Data Ingestion #Scala #Programming #Lambda (AWS Lambda) #Monitoring #Airflow #CRM (Customer Relationship Management)
Role description
AWS DATA ENGINEER | 6-MONTH CONTRACT | £500-£550 PER DAY | OUTSIDE IR35 | FULLY REMOTE

This role offers a great opportunity for an AWS Data Engineer to join a fast-paced media company working on a cutting-edge data ingestion and transformation project. You'll take ownership of end-to-end pipeline development, integrating real-time data streams and third-party APIs into a modern analytics stack. The environment promotes autonomy, modern tooling, and close collaboration across technical and business teams.

THE COMPANY
This is a leading media organisation leveraging data to drive audience insights, engagement strategies, and targeted content delivery. They are investing heavily in their data platform and tooling, with an emphasis on real-time decision-making and scalable infrastructure. You'll be joining a team focused on delivering data into key business systems, including CRM and analytics layers.

THE ROLE
You'll be part of a cross-functional data engineering team responsible for ingesting, transforming, and delivering data to critical internal systems. This includes designing scalable AWS-based pipelines, integrating external APIs, and orchestrating transformations using dbt and Airflow. You'll also support the transition of R-based data streams into more maintainable Python workflows.

Your responsibilities will include:
• Building and maintaining ingestion pipelines using AWS Lambda, API Gateway, and Kinesis (a sketch of a typical ingestion handler appears at the end of this listing).
• Integrating third-party APIs into the data platform and transforming data for CRM delivery.
• Migrating R-based data streams into modern Airflow-managed Python/dbt pipelines (see the DAG sketch at the end of this listing).
• Ensuring observability and reliability using CloudWatch and automated monitoring (see the metrics sketch at the end of this listing).
• Supporting both BAU (business-as-usual) and new feature development within the data engineering function.

KEY SKILLS AND REQUIREMENTS
• Proven experience with AWS services including Lambda, API Gateway, S3, Kinesis, and CloudWatch.
• Strong programming ability in Python and data transformation skills using SQL and dbt.
• Experience with Airflow for orchestration and scheduling.
• Familiarity with third-party API integration and scalable data delivery methods.
• Excellent communication skills and the ability to work in a collaborative, agile environment.

HOW TO APPLY
Please register your interest by sending your CV via the apply link on this page.
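
ILLUSTRATIVE SKETCHES
The sketches below illustrate the kind of work described in the responsibilities above. All names, endpoints, and paths in them are hypothetical placeholders rather than details taken from the role.

First, a minimal sketch of a Lambda ingestion handler that pulls JSON records from a third-party API and pushes them onto a Kinesis stream, assuming a hypothetical API endpoint and stream name supplied via environment variables:

```python
import json
import os
import urllib.request

import boto3

kinesis = boto3.client("kinesis")

# Hypothetical defaults; in practice these would come from deployment config.
STREAM_NAME = os.environ.get("STREAM_NAME", "audience-events")
API_URL = os.environ.get("API_URL", "https://api.example.com/events")


def handler(event, context):
    """Fetch a batch of records from the third-party API and push to Kinesis."""
    with urllib.request.urlopen(API_URL, timeout=10) as resp:
        records = json.loads(resp.read())

    for record in records:
        kinesis.put_record(
            StreamName=STREAM_NAME,
            Data=json.dumps(record).encode("utf-8"),
            # Partition on a stable key so records for one user stay ordered.
            PartitionKey=str(record.get("user_id", "unknown")),
        )

    return {"statusCode": 200, "body": f"ingested {len(records)} records"}
```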
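
Next, a minimal sketch of an Airflow-managed replacement for an R-based stream: a Python extract task followed by a dbt run. It assumes Airflow 2.4+ and a hypothetical dbt project path and model selector:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_events():
    """Placeholder for the Python port of the legacy R extract logic."""
    ...


with DAG(
    dag_id="crm_events_pipeline",  # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@hourly",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_events", python_callable=extract_events
    )

    transform = BashOperator(
        task_id="dbt_run",
        # Hypothetical project dir and model selector.
        bash_command="dbt run --project-dir /opt/dbt/crm --select crm_delivery",
    )

    extract >> transform
```

Keeping the extract in Python and the transformations in dbt mirrors the split the role describes: Airflow owns scheduling and dependencies, while dbt owns the SQL models.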
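
Finally, a minimal sketch of the observability side: publishing a custom CloudWatch metric from a pipeline step so an alarm can fire when ingestion stalls. The namespace and metric name are hypothetical:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")


def report_ingested(count: int) -> None:
    """Emit a RecordsIngested data point for CloudWatch alarms to watch."""
    cloudwatch.put_metric_data(
        Namespace="DataPlatform/Ingestion",  # hypothetical namespace
        MetricData=[
            {
                "MetricName": "RecordsIngested",  # hypothetical metric name
                "Value": float(count),
                "Unit": "Count",
            }
        ],
    )
```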