

Source Group International
Python Quant Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Quant Developer on a contract basis, requiring expertise in Python, statistical libraries, and cloud platforms (AWS/Azure). Ideal candidates should have financial modeling experience and familiarity with data engineering tools. Pay rate and contract length unspecified.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 6, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Libraries #Data Architecture #Python #Snowflake #Data Pipeline #Airflow #Scala #ETL (Extract, Transform, Load) #Data Engineering #Automation #Programming #Pandas #Cloud #Deployment #Data Transformations #Azure #Version Control #Data Analysis #Apache Airflow #NumPy #AWS (Amazon Web Services) #Documentation
Role description
Role Overview:
The Quantitative Developer will design, implement, and maintain analytical models and processes, with a strong focus on Python development, leveraging statistical libraries, and utilising cloud platforms such as AWS, Azure and Snowflake for scalable deployment.
Key Responsibilities:
• Develop and optimise quantitative models and analytical tools in Python
• Leverage statistical packages (e.g., pandas, NumPy) for data analysis and modelling (see the sketch after this list)
• Work closely with portfolio managers, researchers, and technology teams to deploy robust, scalable solutions
• Design and implement data pipelines and workflows utilising AWS or Azure infrastructures
• Ensure best practices in code quality, version control, and deployment automation
• Provide technical support and documentation for developed solutions
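The sketch below is illustrative only and not part of the role description: a minimal pandas/NumPy data transformation of the kind these responsibilities describe. The function name, column name, rolling window, and annualisation factor are all hypothetical.

import numpy as np
import pandas as pd

def rolling_volatility(prices: pd.DataFrame, window: int = 20) -> pd.DataFrame:
    # Annualised rolling volatility from daily close prices (assumes 252 trading days).
    log_returns = np.log(prices / prices.shift(1))
    return log_returns.rolling(window).std() * np.sqrt(252)

if __name__ == "__main__":
    # Hypothetical example data: one asset, 100 business days of simulated prices.
    dates = pd.date_range("2025-01-01", periods=100, freq="B")
    prices = pd.DataFrame({"ASSET_A": 100 + np.cumsum(np.random.normal(0, 1, 100))}, index=dates)
    print(rolling_volatility(prices).tail())

In practice a model like this would be wired into a scheduled pipeline (e.g., an Airflow DAG) and deployed on AWS or Azure, as the responsibilities above indicate.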
Required Skills and Experience:
• Proven experience in Python programming for quantitative or analytical applications
• Strong proficiency with relevant Python statistical libraries (e.g., pandas, NumPy)
• Practical experience utilising AWS and/or Azure for deployment and workflow orchestration
• Solid understanding of financial data and modelling techniques (preferred)
• Excellent analytical, communication, and problem-solving skills
• Experience with data engineering and ETL tools such as Apache Airflow or custom ETL scripts
• Strong problem-solving skills and a keen analytical mindset, especially in handling large data sets and complex data transformations
• Strong experience in setting up an underlying test framework for complex data calculations and processing (see the sketch after this list)
• Strong familiarity with engineering and data architecture principles
• Prior experience in a front office or financial services environment (desirable)
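The sketch below is illustrative only: a minimal pytest-style check of the kind a test framework for data calculations might contain. The normalise function and the properties it asserts are hypothetical, not taken from the role description.

import numpy as np
import pandas as pd

def normalise(series: pd.Series) -> pd.Series:
    # Scale a series to zero mean and unit variance (population standard deviation).
    return (series - series.mean()) / series.std(ddof=0)

def test_normalise_is_zero_mean_unit_variance():
    s = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0])
    result = normalise(s)
    assert np.isclose(result.mean(), 0.0)
    assert np.isclose(result.std(ddof=0), 1.0)

Run with pytest; the assertions document the expected statistical properties of the transformation, which is the point of a test framework around complex data calculations.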