

Senior Python Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Python Engineer with a contract length of "unknown" and a negotiable pay rate. It requires strong Python skills, recent financial services experience, and proficiency in orchestration tools like Airflow and ADF. Hybrid work is based in London.
Country
United Kingdom
Currency
£ GBP
-
Day rate
-
Date discovered
July 29, 2025
Project duration
Unknown
-
Location type
Hybrid
-
Contract type
Outside IR35
-
Security clearance
Unknown
-
Location detailed
London Area, United Kingdom
-
Skills detailed
#Data Pipeline #Datasets #Automation #Kafka (Apache Kafka) #Azure #Python #Airflow #Data Processing #Code Reviews #Databricks #ETL (Extract, Transform, Load) #Azure Data Factory #Data Engineering #Apache Airflow #Streamlit #Documentation #ADF (Azure Data Factory) #Snowflake #Data Integration
Role description
Python Engineer - Hybrid (1 day a week in London) - Outside IR35 - Rate negotiable
Must have worked in Financial Services
We are seeking a skilled Python Developer to join our client's data engineering team, focusing on data processing, automation, and orchestration workflows within a reference data platform.
This is a technically demanding role that involves building and maintaining robust data pipelines, ensuring seamless orchestration across systems, and integrating with modern data platforms.
Key Responsibilities:
• Develop and maintain Python-based data processing solutions
• Design and implement orchestration workflows using tools such as Apache Airflow, Azure Data Factory (ADF), and Control-M
• Collaborate with cross-functional teams to optimize data integration and transformation processes
• Work with data platforms including Databricks, Snowflake, and Exadata to manage and manipulate large-scale datasets
• Integrate and manage event-driven architectures using Kafka and Event Hub
• Contribute to automation and efficiency improvements across the reference data platform
• Participate in code reviews, testing, and documentation
Key Skills & Experience:
• Very strong Python skills, particularly for backend and data processing tasks
• Recent experience in the financial services industry, working on internal systems
• Hands-on experience with orchestration tools such as Airflow, ADF, or Control-M
• Familiarity with data platforms such as Databricks, Snowflake, and Exadata
• Experience with event streaming technologies like Kafka and Event Hub
• Bonus: exposure to Streamlit or Power Apps for lightweight UI development