

Square One Resources
AWS Cloud/Data Engineer - Airflow
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Cloud/Data Engineer - Airflow, offering up to £400 per day INSIDE IR35, based in the Manchester area, requiring 3 days in-office. Key skills include AWS, Airflow, SQL, and DevOps, with financial services experience preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
400
-
🗓️ - Date
October 15, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Manchester Area, United Kingdom
-
🧠 - Skills detailed
#Data Lake #Scala #AWS DevOps #Version Control #SQL (Structured Query Language) #DevOps #GitHub #Deployment #Monitoring #Data Engineering #Security #ETL (Extract, Transform, Load) #Storage #AWS (Amazon Web Services) #Airflow #Data Pipeline #Cloud #Logging #GitLab
Role description
Job Title: AWS DevOps/Data Engineer - Airflow
Location: Manchester Area - 3 days per week in the office
Salary/Rate: Up to £400 per day INSIDE IR35
Start Date: 03/11/2025
Job Type: Contract
Company Introduction
We have an exciting opportunity now available with one of our sector-leading financial services clients! They are currently looking for a skilled AWS DevOps/Data Engineer to join their team for an initial contract until the end of the year.
Job Responsibilities/Objectives
• Deploy comprehensive cloud infrastructure for various products, including Astronomer Airflow and AccelData environments.
• Facilitate cross-functional integration between vendor products and other systems, such as data lakes, storage, and compute services.
• Establish best practices for cloud security, scalability, and performance.
• Manage and configure vendor product deployments, ensuring the setup and maintenance of environments.
• Ensure high availability, scalability, and fault tolerance of Airflow clusters.
• Implement monitoring, alerting, and logging for Airflow and related components.
• Perform upgrades and patches for platform-related components.
• Oversee capacity planning, resource allocation, and optimisation of Airflow workers.
• Maintain and configure integrations with source control systems (e.g., GitHub, GitLab) for version control.
• Collaborate with cloud providers (e.g., AWS) for pipeline integration and scaling requirements.
• Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimising data delivery, and automating manual processes.
• Develop infrastructure for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies.
• Work with stakeholders, including design, product, and executive teams, to address platform-related technical issues.
• Build analytical tools to leverage the data pipeline, providing actionable insights into key business performance metrics, such as operational efficiency and customer acquisition.
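As a rough illustration of the extract-transform-load work described above: the sketch below aggregates raw events into a reporting table. This is a generic, hypothetical example only; the client's actual stack (AWS data lake services, Astronomer Airflow, AccelData) is stood in for here by Python's built-in sqlite3, and all table and column names are invented.

```python
import sqlite3

def run_etl(conn):
    """Minimal ETL sketch: extract raw events, transform, load a reporting table.

    In practice the source would be e.g. an S3-backed data lake and the job
    would run as an Airflow task; sqlite3 is used here purely for illustration.
    """
    cur = conn.cursor()
    # Extract: a raw source table of per-customer transaction amounts
    cur.execute("CREATE TABLE IF NOT EXISTS raw_events (customer TEXT, amount REAL)")
    cur.executemany("INSERT INTO raw_events VALUES (?, ?)",
                    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)])
    # Transform + Load: aggregate totals per customer into a reporting table
    cur.execute("""
        CREATE TABLE customer_totals AS
        SELECT customer, SUM(amount) AS total
        FROM raw_events
        GROUP BY customer
    """)
    conn.commit()
    return dict(cur.execute("SELECT customer, total FROM customer_totals"))

if __name__ == "__main__":
    print(run_etl(sqlite3.connect(":memory:")))
```

In an Airflow deployment, each of the extract, transform, and load steps would typically be a separate task with monitoring and alerting attached, rather than a single function.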