

AWS Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Engineer on a 3-month contract, paying £400-£450 per day, fully remote. Key skills include AWS services, Python, SQL, DBT, and Airflow. Experience in data ingestion and third-party API integration is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
450
🗓️ - Date discovered
May 31, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#Cloud #Data Ingestion #S3 (Amazon Simple Storage Service) #Scala #Monitoring #dbt (data build tool) #Programming #Python #ETL (Extract, Transform, Load) #Data Engineering #CRM (Customer Relationship Management) #AWS Lambda #AWS (Amazon Web Services) #Agile #Airflow #Observability #SQL (Structured Query Language) #API (Application Programming Interface) #R
Role description
AWS DATA ENGINEER
3-MONTH CONTRACT
£400-£450 PER DAY OUTSIDE IR35
FULLY REMOTE
This role offers a great opportunity for an AWS Data Engineer to join a fast-paced media company working on a cutting-edge data ingestion and transformation project. You'll take ownership of end-to-end pipeline development, integrating real-time data streams and third-party APIs into a modern analytics stack. The environment promotes autonomy, modern tooling, and close collaboration across technical and business teams.
THE COMPANY
This is a leading media organisation leveraging data to drive audience insights, engagement strategies, and targeted content delivery. They are investing heavily in their data platform and tooling, with an emphasis on real-time decision-making and scalable infrastructure. You'll be joining a team focused on delivering data into key business systems including CRM and analytics layers.
THE ROLE
You'll be part of a cross-functional data engineering team responsible for ingesting, transforming, and delivering data to critical internal systems. This includes designing scalable AWS-based pipelines, integrating external APIs, and orchestrating transformations using DBT and Airflow. You'll also support the transition of R-based data streams into more maintainable Python workflows.
Your responsibilities will include:
• Building and maintaining ingestion pipelines using AWS Lambda, API Gateway, and Kinesis.
• Integrating third-party APIs into the data platform and transforming data for CRM delivery.
• Migrating R-based data streams into modern Airflow-managed Python/DBT pipelines.
• Ensuring observability and reliability using CloudWatch and automated monitoring.
• Supporting both BAU and new feature development within the data engineering function.
KEY SKILLS AND REQUIREMENTS
• Proven experience with AWS services including Lambda, API Gateway, S3, Kinesis, and CloudWatch.
• Strong programming ability in Python and data transformation skills using SQL and DBT.
• Experience with Airflow for orchestration and scheduling.
• Familiarity with third-party API integration and scalable data delivery methods.
• Excellent communication skills and the ability to work in a collaborative, agile environment.
HOW TO APPLY
Please register your interest by sending your CV via the apply link on this page.