

Airflow Optimization Specialist – Azure Data Platform
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an "Airflow Optimization Specialist – Azure Data Platform" on a freelance contract for up to £500/day, hybrid in London. Requires expertise in Apache Airflow, Python, cloud services (Azure, AWS, GCP), and CI/CD tools.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
500
-
🗓️ - Date discovered
July 10, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
London EC4A 4HH
-
🧠 - Skills detailed
#Kubernetes #Airflow #DataEngineering #Deployment #Logging #Scala #AutoScaling #Grafana #Programming #Jenkins #Docker #Prometheus #Monitoring #Automation #DevOps #Cloud #Azure #AzureDevOps #Python #DataPipeline #AWS (Amazon Web Services) #ApacheAirflow #Storage #GitHub #GCP (Google Cloud Platform)
Role description
We are seeking an experienced Apache Airflow Subject Matter Expert (SME) (contract, hybrid in London) to join our Data Engineering team. You will be responsible for optimizing Airflow environments, building scalable orchestration frameworks, and supporting enterprise-scale data pipelines while collaborating with cross-functional teams.
Key Responsibilities:
- Optimize and fine-tune existing Apache Airflow environments, addressing performance and reliability.
- Design and develop scalable, modular, and reusable Airflow DAGs for complex data workflows.
- Integrate Airflow with cloud-native services such as data factories, compute platforms, storage, and analytics.
- Develop and maintain CI/CD pipelines for DAG deployment, testing, and release automation.
- Implement monitoring, alerting, and logging standards to ensure operational excellence.
- Provide architectural guidance and hands-on support for new data pipeline development.
- Document Airflow configurations, deployment processes, and operational procedures.
- Mentor engineers and lead knowledge-sharing on orchestration best practices.
Required Skills:
- Expertise in Airflow internals, including schedulers, executors (Celery, Kubernetes), and plugins.
- Experience with autoscaling solutions (KEDA) and Celery for distributed task execution.
- Strong hands-on skills in Python programming and modular code development.
- Proficiency with cloud services (Azure, AWS, or GCP), including data pipelines, compute, and storage.
- Solid experience with CI/CD tools such as Azure DevOps, Jenkins, or GitHub Actions.
- Familiarity with Docker, Kubernetes, and related deployment technologies.
- Strong background in monitoring tools (Prometheus, Grafana) and log aggregation (ELK, Log Analytics).
- Excellent problem-solving, communication, and collaboration skills.
If this role interests you, please send your CV to info@pixelcodetech.co.uk.
Job Type: Freelance
Pay: Up to £500.00 per day
Schedule:
Day shift
Work Location: Hybrid remote in London EC4A 4HH