

Queen Square Recruitment
Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Developer (PySpark + Fabric) based in London, offering a 6-month contract at £400–£425/day. Key skills include strong PySpark and Microsoft Fabric experience, data pipeline design, and familiarity with financial datasets.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
425
-
🗓️ - Date
November 1, 2025
🕒 - Duration
6 months (potential extension)
-
🏝️ - Location
On-site
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Security #GDPR (General Data Protection Regulation) #PySpark #Datasets #Scala #ETL (Extract, Transform, Load) #Spark SQL #Batch #BI (Business Intelligence) #Dataflow #DevOps #Spark (Apache Spark) #Security #Semantic Models #GIT #Data Engineering #Data Pipeline #Microsoft Power BI #SQL (Structured Query Language) #Compliance #Data Lake
Role description
Developer (PySpark + Fabric)
Location: London (Office-based)
Type: Contract – Inside IR35
Duration: 6 months (potential extension)
Day Rate: £400–£425/day (Inside IR35 – depending on experience)
About the Role:
We’re looking for an experienced PySpark + Fabric Developer to join a leading global financial markets and data services organisation. You’ll be part of a team driving large-scale data transformation and modernisation projects.
Key Responsibilities:
• Design, build, and optimise scalable data pipelines (batch & streaming).
• Develop and manage dataflows and semantic models supporting analytics.
• Implement robust data validations, cleansing, and performance tuning.
• Build ETL workflows using modern data engineering best practices.
• Collaborate with business stakeholders to deliver fit-for-purpose solutions.
• Ensure compliance with data security, access control, and governance.
Skills & Experience:
• Strong hands-on experience in PySpark (RDDs, DataFrames, Spark SQL).
• Proven experience in Microsoft Fabric, Data Lake, and ETL pipeline design.
• Familiarity with time-series data, market feeds, and financial datasets.
• Experience with Git, CI/CD pipelines, and DevOps practices.
• Excellent communication skills and the ability to work collaboratively.
Nice to Have:
• Knowledge of Power BI integration and OneLake.
• Understanding of financial regulations (GDPR, SOX).
