

Troy Consultancy
Python Developer (Apache Airflow)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Python Developer (Apache Airflow); the contract length and pay rate are not specified. Candidates must have 5+ years of Python experience and strong Apache Airflow, ETL, and cloud platform skills.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 24, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glasgow City, Scotland, United Kingdom
-
🧠 - Skills detailed
#Data Warehouse #Data Engineering #Automation #dbt (data build tool) #Storage #DevOps #AWS (Amazon Web Services) #Azure #Data Integration #GCP (Google Cloud Platform) #Snowflake #Apache Airflow #Kafka (Apache Kafka) #Databases #Python #Spark (Apache Spark) #BigQuery #Observability #ETL (Extract, Transform, Load) #Monitoring #Debugging #PostgreSQL #Cloud #GIT #MongoDB #Docker #Redshift #Deployment #Airflow #Data Pipeline
Role description
We are looking for skilled Python Developers with strong Apache Airflow expertise to join our data engineering and automation team. The successful candidates will design, build, and maintain robust data pipelines and orchestration workflows across multiple data systems and environments.
Key Responsibilities
• Design, implement, and manage Apache Airflow DAGs for automated data pipelines (a minimal sketch follows this list).
• Develop and maintain Python-based ETL processes integrating APIs, databases, and cloud storage.
• Optimise Airflow performance and ensure monitoring, alerting, and fault-tolerant execution.
• Collaborate with DevOps and analytics teams to integrate pipelines with cloud infrastructure.
• Document workflows and participate in deployment reviews and CI/CD improvements.
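For illustration only, the sketch below shows the kind of DAG these responsibilities describe, written against the Airflow 2.x TaskFlow API. The DAG name, API URL, connection ID, and target table are hypothetical placeholders, not details taken from this listing.

from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(
    dag_id="example_api_to_postgres",  # hypothetical name
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    start_date=datetime(2025, 1, 1),
    catchup=False,
)
def example_api_to_postgres():
    @task(retries=2)
    def extract() -> list:
        # Pull records from a placeholder REST API.
        resp = requests.get("https://api.example.com/records", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(records: list) -> None:
        # Write rows to Postgres via the provider hook; the connection ID is a
        # placeholder configured in the Airflow UI or environment.
        from airflow.providers.postgres.hooks.postgres import PostgresHook

        hook = PostgresHook(postgres_conn_id="analytics_db")
        rows = [(r["id"], r["value"]) for r in records]
        hook.insert_rows(table="staging.records", rows=rows, target_fields=["id", "value"])

    load(extract())


example_api_to_postgres()

In a production pipeline the load step would typically batch or stage larger volumes rather than passing full record sets through XCom, and retries, alerting, and SLAs would be configured per task.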
Required Skills & Experience
• Minimum 5 years of hands-on Python development experience.
• Proven expertise in Apache Airflow (authoring DAGs, operators, sensors, and custom hooks; a sample sensor follows this list).
• Experience with PostgreSQL, MongoDB, and RESTful APIs.
• Strong background in data integration, ETL orchestration, and scheduling.
• Working knowledge of Docker, Git, and cloud platforms (AWS/GCP/Azure).
• Excellent analytical and debugging skills, with attention to detail.
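As a sketch of the "operators, sensors, and custom hooks" expertise listed above, the example below shows a minimal custom sensor; the class name, status URL, and polling settings are hypothetical.

import requests
from airflow.sensors.base import BaseSensorOperator


class ExportReadySensor(BaseSensorOperator):
    """Polls a placeholder status endpoint until an upstream export reports 'ready'."""

    def __init__(self, status_url: str, **kwargs):
        super().__init__(**kwargs)
        self.status_url = status_url

    def poke(self, context) -> bool:
        # poke() runs on each poke_interval; returning True lets downstream tasks proceed.
        resp = requests.get(self.status_url, timeout=10)
        resp.raise_for_status()
        return resp.json().get("status") == "ready"


# Example usage inside a DAG (values are illustrative):
# wait_for_export = ExportReadySensor(
#     task_id="wait_for_export",
#     status_url="https://api.example.com/export/status",
#     poke_interval=300,    # check every 5 minutes
#     timeout=2 * 60 * 60,  # give up after 2 hours
# )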
Nice-to-Have
• Familiarity with dbt, Kafka, or Spark.
• Exposure to data warehouse environments (e.g., BigQuery, Snowflake, Redshift).
• Understanding of event-driven architectures and workflow observability tools.






