

Optimization Specialist
Featured Role | Apply direct with Data Freelance Hub
This role is for an Optimization Specialist; the contract length, pay rate, and work location are not specified. Key skills include Apache Airflow, Azure cloud services, and CI/CD pipelines with Azure DevOps. Experience in data engineering is required.
Country: United Kingdom
Currency: £ GBP
Day rate: -
Date discovered: July 9, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: London Area, United Kingdom
Skills detailed: #Azure Databricks #Version Control #Deployment #Apache Airflow #Storage #Databricks #Airflow #DevOps #Azure cloud #Kubernetes #Scala #Synapse #Data Pipeline #YAML (YAML Ain't Markup Language) #Data Engineering #Logging #AutoScaling #Azure #Prometheus #Azure Data Factory #Microservices #Programming #Grafana #Monitoring #Agile #Python #Azure DevOps #ADF (Azure Data Factory) #Azure Log Analytics #Docker #Cloud
Job description
Airflow Optimization Specialist – Azure Data Platform
Objective:
We are seeking an experienced and highly skilled Apache Airflow Subject Matter Expert (SME) to join our data engineering team. The primary objective of this role is to fine-tune and optimize our existing Airflow environment, ensuring high reliability, performance, and scalability. The ideal candidate will also bring strong expertise in Azure cloud services and Azure DevOps to design, solution, and develop robust orchestration frameworks that support our enterprise-scale data pipelines.
Key Responsibilities:
• Analyze and optimize the current Apache Airflow environment, identifying performance bottlenecks and implementing best practices for orchestration and scheduling.
• Design and implement scalable, modular, and reusable DAGs (Directed Acyclic Graphs) to support complex data workflows (see the sketch after this list).
• Collaborate with data engineers and platform teams to integrate Airflow with Azure Data Factory, Azure Databricks, and other Azure-native services.
• Develop and maintain CI/CD pipelines using Azure DevOps for Airflow DAG deployment, testing, and version control.
• Establish monitoring, alerting, and logging standards for Airflow jobs to ensure operational excellence and rapid incident response.
• Provide architectural guidance and hands-on support for new data pipeline development using Airflow and Azure services.
• Document Airflow configurations, deployment processes, and operational runbooks for internal teams.
• Mentor engineers and contribute to knowledge-sharing sessions on orchestration and workflow management.
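To illustrate the kind of modular, reusable DAG these responsibilities describe, here is a minimal sketch using Airflow's TaskFlow API (assuming Airflow 2.4+). The DAG name, schedule, failure callback, and task bodies are placeholder assumptions, not part of the job specification; the callback stands in for whatever alerting channel the team actually uses.

```python
from datetime import datetime, timedelta

from airflow.decorators import dag, task


def notify_on_failure(context):
    """Hypothetical failure callback; in practice this would push to the
    team's alerting channel (e.g. Azure Monitor or a webhook)."""
    ti = context["task_instance"]
    print(f"Task {ti.task_id} failed in DAG {ti.dag_id}")


default_args = {
    "owner": "data-platform",            # assumed team name
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,
}


@dag(
    dag_id="example_ingest_pipeline",    # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
    default_args=default_args,
    tags=["example"],
)
def example_ingest_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling data from a source system
        return [{"id": 1}, {"id": 2}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Placeholder for a small, unit-testable transformation step
        return [{**r, "processed": True} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder for writing to Azure storage / Synapse
        print(f"Loaded {len(records)} records")

    load(transform(extract()))


example_ingest_pipeline()
```

Keeping extract, transform, and load as small typed functions is what makes each step testable outside the scheduler and reusable across DAGs.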
Required Skills and Qualifications:
• Proven experience as an Apache Airflow SME or Lead Developer in a production-grade environment.
• Strong understanding of Airflow internals, including the scheduler, executor types (Celery, Kubernetes), and plugin development.
• Experience with workload orchestration and autoscaling using KEDA (Kubernetes-based Event Driven Autoscaler), and familiarity with Celery for distributed task execution and background job processing, particularly in data pipeline or microservices environments.
• Hands-on experience with Azure cloud services, especially Azure Data Factory, Azure Databricks, Azure Storage, and Azure Synapse.
• Proficiency in designing and deploying CI/CD pipelines using Azure DevOps (YAML pipelines, release management, artifact handling); see the test sketch after this list.
• Solid programming skills in Python, with experience in writing modular, testable, and reusable code.
• Familiarity with containerization (Docker) and orchestration (Kubernetes) as they relate to Airflow deployment.
• Experience with monitoring tools (e.g., Prometheus, Grafana, Azure Monitor) and log aggregation (e.g., ELK, Azure Log Analytics).
• Strong problem-solving skills and the ability to work independently in a fast-paced, agile environment.
• Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
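As an example of the CI/CD and testable-code requirements above, here is a minimal sketch of a DAG integrity test that a pipeline (for instance an Azure DevOps YAML pipeline running pytest) could execute before deploying DAGs. The dags/ folder path and the owner/retry checks are assumptions about repository layout and team conventions, not requirements stated in this posting.

```python
import pytest
from airflow.models import DagBag


@pytest.fixture(scope="session")
def dag_bag() -> DagBag:
    # Parse project DAG files, skipping the example DAGs shipped with Airflow.
    # "dags/" is an assumed repository path.
    return DagBag(dag_folder="dags/", include_examples=False)


def test_no_import_errors(dag_bag):
    # Any syntax or import error in a DAG file is surfaced here,
    # failing the build before a broken DAG reaches the scheduler.
    assert dag_bag.import_errors == {}


def test_dags_have_owner_and_retries(dag_bag):
    # Enforce basic operational conventions (assumed here) on every DAG.
    for dag_id, dag in dag_bag.dags.items():
        assert dag.default_args.get("owner"), f"{dag_id} is missing an owner"
        assert dag.default_args.get("retries", 0) >= 1, f"{dag_id} has no retries"
```

Running a test like this as a build step keeps DAG deployment gated on the same version-controlled checks for every change.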