Data Engineer - Python

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer - Python with a contract length of "Unknown," offering a day rate of £550. Located in Glasgow (hybrid), it requires proficiency in Python, experience with Databricks and cloud-based data warehousing, and familiarity with Agile practices.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
550
-
πŸ—“οΈ - Date discovered
August 8, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Glasgow, Scotland, United Kingdom
-
🧠 - Skills detailed
#API (Application Programming Interface) #BI (Business Intelligence) #Automation #ETL (Extract, Transform, Load) #GIT #Programming #Scala #Data Modeling #Big Data #Cloud #Python #REST (Representational State Transfer) #Hadoop #Linux #Airflow #Data Warehouse #Agile #Databricks #Data Engineering #Microsoft Power BI #Documentation #Monitoring #Visualization #Snowflake #REST API #Spark (Apache Spark) #Apache Airflow #Data Orchestration #Data Processing #Data Integration
Role description
About the Role
A leading global financial services organization is looking for a Data Engineer to join its Enterprise Technology division in Glasgow, with three days per week in the office. This is an exciting opportunity to work within a high-performing, global engineering team responsible for building, maintaining, and scaling the platforms that underpin the organization's critical business operations. You'll join a dynamic development chapter of 70+ engineers, contributing to strategic platforms that support 10,000+ technology professionals worldwide, and work with the latest tools and cloud platforms to drive automation, data integration, and real-time performance monitoring.

Key Responsibilities
• Design and build efficient, scalable, and secure ETL pipelines using Python and Databricks
• Integrate and manage cloud-based data warehouses (e.g. Snowflake)
• Work with REST APIs and data services to connect and consolidate data across systems
• Maintain and optimize data workflows, ensuring performance, scalability, and resilience
• Collaborate in a global Agile team, participating in sprint planning, reviews, and stand-ups
• Apply rigorous testing and monitoring practices to ensure high reliability of data processes
• Contribute to best practices, documentation, and internal tooling for enhanced developer productivity

What We're Looking For
• Proficiency in Python programming with an emphasis on clean, maintainable code
• Experience with Databricks or similar platforms for distributed data processing
• Strong understanding of ETL principles, data modeling, and data integration
• Hands-on experience with cloud-based data warehousing (e.g. Snowflake)
• Knowledge of Linux, Git, and RESTful API integrations
• Familiarity with Agile development practices

Desirable Skills
• Experience with data orchestration tools such as Apache Airflow
• Background in big data technologies (Spark, Hadoop)
• Exposure to visualization tools such as Power BI
• Understanding of ServiceNow integrations or performance tuning
• Knowledge of monitoring and APM solutions across large-scale systems

Why Apply?
• Work on high-impact platforms used by global engineering teams
• Join a diverse, inclusive, and international team of technologists
• Hybrid working model with flexible work-life balance
• Competitive compensation, professional growth opportunities, and comprehensive benefits
• Centrally located Glasgow office with onsite gym, restaurant, and modern facilities

We are committed to creating an inclusive recruitment experience. If you have a disability or long-term health condition and require adjustments to the recruitment process, our Adjustment Concierge Service is here to support you. Please reach out to us at adjustments@robertwalters.com to discuss further.