BrightBox Group Ltd

Python Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an SC Cleared Python Developer on a contract of unspecified duration, paying £400–£458 per day (Inside IR35). Key skills include strong Python development, PySpark, Delta Lake, Docker, and Azure integration. The role is remote.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
458
🗓️ - Date
January 14, 2026
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Inside IR35
🔒 - Security
Yes
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Delta Lake #Azure #Spark (Apache Spark) #Data Engineering #ETL (Extract, Transform, Load) #PySpark #Scala #Python #Data Governance #Cloud #Data Pipeline #DevOps #Security #Vault #Datasets #Docker #Debugging #Storage
Role description
SC Cleared Python Engineer | Contract | £400–£458 per day (Inside IR35) | SC Clearance is essential | Remote

We are seeking a highly capable Python-focused Data Engineer to join a delivery-driven team building and supporting complex data platforms in Azure. This role is heavily weighted towards Python software engineering rather than traditional ETL-only work. The successful candidate will be someone who writes clean, maintainable, and well-tested Python code, and is comfortable treating data pipelines as production-grade software.

A significant portion of the work involves designing and maintaining complex, test-driven Python data flows, with PySpark used as the execution engine rather than the primary focus. Strong Python fundamentals, testing discipline, and code quality are critical to success in this role.

What you’ll be doing
• Designing and building scalable data pipelines with a Python-first approach
• Developing complex data flows with a strong emphasis on clean architecture, reusable Python modules, and testability
• Writing comprehensive unit tests and BDD tests (Behave), including mocking and patching
• Using PySpark to process large-scale datasets while keeping business logic in Python (illustrated in the first sketch below)
• Creating, maintaining, and optimising Delta Lake tables for performance and reliability (second sketch below)
• Building and running applications in containerised (Docker) environments
• Integrating Python applications with Azure services such as Azure Functions, Key Vault, and Blob Storage (third sketch below)
• Working closely with DevOps and engineering teams to support CI/CD pipelines
• Debugging, tuning, and improving Python and Spark workloads in production
• Following best practices in secure Python development, cloud security, and data governance

What we’re looking for
• Strong, hands-on Python development experience (essential)
• Proven experience writing test-driven Python code in production environments
• Solid data engineering experience using PySpark
• Experience working with Delta Lake
• Hands-on experience with Docker and containerised workflows
• Good knowledge of Azure and integrating Python applications with cloud services
• A software-engineering mindset and comfort working in fast-paced, delivery-focused teams
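
For illustration only (not part of the listing), this first sketch shows the kind of "business logic in plain Python, PySpark as the execution engine" pattern the description emphasises. The function names, column names, and threshold are hypothetical.

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def is_high_value(amount: float, threshold: float = 1000.0) -> bool:
    """Pure-Python business rule: trivially unit-testable without a SparkSession."""
    return amount >= threshold


def add_high_value_flag(df: DataFrame, threshold: float = 1000.0) -> DataFrame:
    """Thin PySpark wrapper that applies the same rule column-wise."""
    return df.withColumn("high_value", F.col("amount") >= F.lit(threshold))


def test_is_high_value():
    # Plain unit test: no Spark runtime needed to exercise the business rule.
    assert is_high_value(1500.0)
    assert not is_high_value(250.0)


if __name__ == "__main__":
    spark = SparkSession.builder.appName("python-first-sketch").getOrCreate()
    df = spark.createDataFrame([(1, 250.0), (2, 1500.0)], ["id", "amount"])
    add_high_value_flag(df).show()
```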
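
A second sketch, again hypothetical, of creating and maintaining a Delta Lake table with the open-source delta-spark package; the table path and retention window are placeholder values, not anything specified by the role.

```python
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

# Enable the Delta Lake extensions on a local SparkSession.
builder = (
    SparkSession.builder.appName("delta-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write a small dataset as a Delta table (path is illustrative).
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "category"])
df.write.format("delta").mode("overwrite").save("/tmp/events_delta")

# Routine maintenance: compact small files and clean up old snapshots.
table = DeltaTable.forPath(spark, "/tmp/events_delta")
table.optimize().executeCompaction()
table.vacuum(retentionHours=168)
```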
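
A third sketch of the testing discipline the role calls for, applied to an Azure integration: a unit test that patches out the Key Vault client so no network call is made. The module-level helper get_storage_key and the secret name "storage-key" are hypothetical.

```python
from unittest import mock

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient


def get_storage_key(vault_url: str) -> str:
    """Fetch a storage key from Azure Key Vault (hypothetical helper)."""
    client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())
    return client.get_secret("storage-key").value


def test_get_storage_key_uses_key_vault():
    # Patch SecretClient in this module so the test never touches Azure.
    fake_secret = mock.Mock(value="s3cr3t")
    with mock.patch(f"{__name__}.SecretClient") as fake_client_cls:
        fake_client_cls.return_value.get_secret.return_value = fake_secret
        assert get_storage_key("https://example.vault.azure.net") == "s3cr3t"
        fake_client_cls.return_value.get_secret.assert_called_once_with("storage-key")


if __name__ == "__main__":
    test_get_storage_key_uses_key_vault()
    print("ok")
```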