

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with strong experience in Python, PySpark, and SQL, on a 3-month fixed-term contract in Glasgow (hybrid). Pay is £42.00-£45.00 per hour. Key skills include data architectures, pipelines, and performance tuning.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
360
🗓️ - Date discovered
September 3, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Glasgow G5
🧠 - Skills detailed
#Data Architecture #BitBucket #Data Lake #Security #AWS (Amazon Web Services) #Java #Data Warehouse #PySpark #Spark (Apache Spark) #SQL (Structured Query Language) #GIT #Version Control #GitLab #Data Engineering #Data Pipeline #Python #Big Data
Role description
We are hiring a Data Engineer (Python / PySpark / Data Pipelines / Big Data).
Location: Glasgow (Hybrid)
Strong experience with Python, PySpark, and SQL.
Build and maintain robust data architectures and pipelines to ensure durable, complete, and consistent data transfer and processing.
Proficiency in Core Java, including Collections, Concurrency, and Memory Management.
Design and implement data warehouses and data lakes that can handle large volumes of data and meet all security requirements.
A solid background in performance tuning, profiling, and resolving production issues in distributed systems.
Experience with version control systems such as Git, GitLab, or Bitbucket; AWS experience is a plus.
Key Skills: Data architectures / Data pipelines / Data warehouses / Data lakes / Python / PySpark
Job Type: Fixed-term contract
Contract length: 3 months
Pay: £42.00-£45.00 per hour
Expected hours: 40 per week
Benefits:
Flexitime
Application question(s):
Do you have strong experience with Python, PySpark, and SQL?
Are you within commutable distance of Glasgow?