

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position on a 6-month fixed-term contract, paying £400.00-£420.00 per day. Key skills include Python, PySpark, Snowflake, Azure Data Factory, and ETL experience. The role requires strong analytical abilities and knowledge of DevOps processes.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
420
-
🗓️ - Date discovered
June 5, 2025
🕒 - Project duration
6 months
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Glasgow
-
🧠 - Skills detailed
#Spark (Apache Spark) #SQL Queries #SQL (Structured Query Language) #Security #Python #ETL (Extract, Transform, Load) #Azure Data Factory #Data Lake #Infrastructure as Code (IaC) #Databricks #Unit Testing #Azure #BI (Business Intelligence) #Snowflake #Data Warehouse #DevOps #PySpark #Agile #Data Engineering #ADLS (Azure Data Lake Storage) #ADF (Azure Data Factory) #Databases #RDBMS (Relational Database Management System)
Role description
Job Overview
We are seeking a skilled Data Engineer to join our team. The successful candidate will be responsible for developing, constructing, testing, and maintaining architectures such as databases and large-scale processing systems.
Responsibilities
Primary skills: data engineering, Python, PySpark
Expertise in data engineering, leveraging technologies such as Snowflake, Azure Data Factory, ADLS, and Databricks.
Expertise in writing SQL queries against any RDBMS, including query optimization.
Experience structuring a Data Lake for reliability, security, and performance.
Experience implementing ETL for Data Warehouse and Business Intelligence solutions.
Ability to write effective, modular, dynamic, and robust code, and to establish coding standards.
Strong analytical, problem-solving, and troubleshooting abilities.
Good understanding of unit testing, software change management, and software release management.
Knowledge of DevOps processes, including CI/CD and Infrastructure as Code fundamentals.
Experience performing root cause analysis on data and processes, and identifying opportunities for improvement.
Familiarity with Agile software development methodologies.
Proactive communication and stakeholder management.
Job Type: Fixed-term contract
Contract length: 6 months
Pay: £400.00-£420.00 per day
Experience:
Python: 4 years (required)
PySpark: 4 years (required)
Snowflake: 2 years (required)
Azure Data Factory: 4 years (required)
Databricks: 3 years (required)
SQL queries: 3 years (required)
Data Lake: 3 years (required)
ETL for Data Warehouse: 4 years (required)
CI/CD and Infrastructure as Code: 3 years (required)
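The "SQL queries with query optimization" requirement can be sketched with Python's standard-library sqlite3 module: adding an index on a filtered column changes the query plan from a full table scan to an index search. This is illustrative only; the posting does not name a specific RDBMS, and the table and index names here are invented.

```python
import sqlite3

# Illustrative only: sqlite3 stands in for "any RDBMS" to show how an
# index changes the query plan for a selective filter.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [(f"cust{i % 100}", i * 1.5) for i in range(1000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer = 'cust7'"

# Without an index, SQLite scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# With the index, the planner switches to an index search.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # e.g. "SCAN orders"
print(plan_after[0][-1])   # e.g. "SEARCH orders USING ... idx_orders_customer ..."
```

Reading query plans like this is the everyday mechanics behind "query optimization" on larger engines such as Snowflake, where the same reasoning applies even though the tooling differs.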