Glocomms

Cloud-Based Python Engineer - Remote

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud-Based Python Engineer on a remote contract; the pay rate is TBD. It requires a Bachelor's degree, 4-5 years of experience with Python and SQL, and proficiency in RESTful APIs and data applications; financial services experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 9, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Docker #Stories #Web Services #Business Analysis #Redis #Documentation #Data Engineering #Computer Science #Mathematics #NoSQL #Leadership #Data Warehouse #Elastic Stack #PostgreSQL #Airflow #Python #Scrum #Cybersecurity #Cloud #Security #Kafka (Apache Kafka) #Agile #Programming #SQL (Structured Query Language) #Ansible #Data Pipeline #Storage
Role description
Cloud-Based Python Engineer - Remote

We are currently seeking a highly skilled and motivated Cloud-Based Python Engineer to join our team on a remote contract basis. This opportunity is with a leading investment firm committed to improving clients' lives by enhancing the way they invest. The engineering team emphasises collaboration, continuous learning, and innovation to solve the firm's most critical business challenges.

As part of the engineering team, you will be embedded within the Data Distribution department. This team manages the enterprise investment data warehouse, which supports key functions such as Research, Portfolio Management, Trading, and Analytics. You will work closely with stakeholders to understand their needs, collaborate on the design of solutions, and leverage emerging data engineering tools and best practices.

Role Overview
In this role, you will design, develop, and test multiple application services, with a focus on building data platforms and services. You will also expand and optimise our data and data pipeline architecture. The ideal candidate demonstrates strong technical and analytical abilities across multiple tech stacks, is detail-oriented, and is passionate about building data systems from the ground up.

Key Responsibilities
• Deliver investment data technology solutions for Research, Portfolio Management, Trading, Analytics, and Reporting.
• Design, develop, test, and deploy data technology solutions with a focus on speed and quality.
• Collaborate with business analysts, product owners, and project managers to define user stories, estimates, and work plans.
• Work independently and advise clients and IT leadership on technology capabilities and strategies.
• Design and implement changes across data pipelines: ingestion, validation, integration, storage, and delivery.
• Write unit and integration tests, contribute to engineering documentation, and maintain internal wikis.
• Build high-performance data-transfer tools that meet SLA requirements.
• Ensure data consistency, refresh rates, and caching across multiple interfaces.
• Enhance CI/CD pipelines and participate in code/design reviews.
• Provide technical troubleshooting and production support.

Required Qualifications
• Bachelor's degree in Engineering, Mathematics, Computer Science, or a related field; a Master's degree is preferred.
• 4-5 years of programming experience in Python or equivalent; experience with multiple languages/platforms is a plus.
• Proficiency in building RESTful APIs and web services.
• 4-5 years of SQL experience (NoSQL experience is a plus).
• Familiarity with SOLID principles and Domain-Driven Design.
• Experience with high-performance, high-availability data applications.
• Strong performance-tuning skills.
• Experience with automated acceptance testing and writing maintainable, unit-tested code.
• Understanding of cybersecurity risks and secure application design.
• Experience in collaborative, agile development environments.
• Knowledge of best practices for always-up, always-available services.
• Experience with Agile Scrum and Waterfall methodologies.
• Strong analytical and problem-solving skills.
• Excellent written and verbal communication skills.

Preferred Qualifications
• Experience in the financial services industry.
• Familiarity with the following technologies:
  • Kafka
  • Airflow
  • PostgreSQL
  • Ansible
  • Elastic Stack
  • RabbitMQ
  • Redis
  • Docker
• Knowledge of:
  • Okta
  • OAuth2
  • PlainID