

Airflow Subject Matter Expert (SME)
Featured Role | Apply direct with Data Freelance Hub
This role is for an Airflow Subject Matter Expert (SME) on a 6-month contract in London, requiring expertise in Apache Airflow, Azure services, and CI/CD pipelines. Key skills include Python programming, workload orchestration, and containerization.
Country: United Kingdom
Currency: £ GBP
Day rate: -
Date discovered: July 10, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: London Area, United Kingdom
Skills detailed: #Kubernetes #Airflow #Azure Data Factory #Synapse #Deployment #Azure cloud #Microservices #Azure Databricks #Scala #Agile #AutoScaling #Grafana #Programming #Docker #Prometheus #YAML (YAML Ain't Markup Language) #Azure Log Analytics #Monitoring #DevOps #Cloud #Azure #Azure DevOps #Python #Data Pipeline #Apache Airflow #Databricks #Storage #ADF (Azure Data Factory)
Role description
RED Global are currently looking for an Airflow Subject Matter Expert (SME) to help optimize our customer's existing Airflow environment, ensuring high reliability, performance, and scalability. This position is a 6-month contract plus extensions and requires someone who is available to be in London 3 days per week.
We are looking for the following:
• Proven experience as an Apache Airflow SME or Lead Developer in a production-grade environment.
• Strong understanding of Airflow internals, including the scheduler, executor types (Celery, Kubernetes), and plugin development.
• Experience with workload orchestration and autoscaling using KEDA (Kubernetes-based Event Driven Autoscaler), and familiarity with Celery for distributed task execution and background job processing, particularly in data pipeline or microservices environments.
• Hands-on experience with Azure cloud services, especially Azure Data Factory, Azure Databricks, Azure Storage, and Azure Synapse.
• Proficiency in designing and deploying CI/CD pipelines using Azure DevOps (YAML pipelines, release management, artifact handling).
• Solid programming skills in Python, with experience in writing modular, testable, and reusable code.
• Familiarity with containerization (Docker) and orchestration (Kubernetes) as it relates to Airflow deployment.
• Experience with monitoring tools (e.g., Prometheus, Grafana, Azure Monitor) and log aggregation (e.g., ELK, Azure Log Analytics).
• Strong problem-solving skills and the ability to work independently in a fast-paced, agile environment.
• Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
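As a rough illustration of the "modular, testable, and reusable" Python style mentioned above (a hedged sketch, not part of the role description — the function name and data shape are invented for illustration):

```python
# Hypothetical example of keeping Airflow task logic in plain functions,
# so it can be unit-tested without a running scheduler. The function name
# and record shape here are invented, not taken from the posting.

def filter_recent(records, cutoff):
    """Pure task logic: keep records dated on or after `cutoff` (ISO date strings)."""
    return [r for r in records if r["date"] >= cutoff]

# In a real DAG, such a function would typically be wired into an operator, e.g.:
#   PythonOperator(task_id="filter_recent", python_callable=filter_recent, ...)
# (left as a comment here so the sketch runs without Airflow installed)

print(filter_recent([{"date": "2025-07-10"}, {"date": "2024-01-01"}], "2025-01-01"))
```

The point of this pattern is that the data logic stays independent of Airflow itself, which keeps tasks easy to test and reuse across DAGs.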
If this sounds of interest, please apply directly and someone from the team will be in touch to discuss the requirement further.