

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position based in Manchester for 3-6 months at up to £345/day. Key skills include DBT, Snowflake, and PL/SQL, with 5+ years of experience required in data engineering and strong data modeling knowledge.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
345
🗓️ - Date discovered
August 30, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
Manchester, England, United Kingdom
🧠 - Skills detailed
#SQL (Structured Query Language) #dbt (data build tool) #Git #ETL (Extract, Transform, Load) #Data Quality #BI (Business Intelligence) #Version Control #Data Science #Cloud #Documentation #Snowflake #Airflow #Scala #Clustering #Security #Data Engineering #Monitoring #Code Reviews #Data Analysis #Data Processing
Role description
Job Description
Data Engineer
Start: ASAP
Duration: 3-6 months
Location: Manchester (3 days per week)
Pay: up to £345/day (inside IR35)
We’re looking for an experienced Data Engineer with deep expertise in DBT, Snowflake, and PL/SQL to join our growing data team on a contract basis. This role will be instrumental in designing, building, and maintaining high-quality data transformation pipelines that support business intelligence, analytics, and data science across the organisation.
Key Responsibilities
• Design and implement scalable data models and transformation pipelines using DBT on Snowflake
• Develop efficient and maintainable PL/SQL code for complex data processing and transformations
• Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver reliable solutions
• Optimise Snowflake performance through query tuning, clustering, and resource management
• Uphold data quality and integrity through rigorous testing, documentation, and monitoring
• Participate in code reviews, architecture planning, and continuous improvement efforts
• Maintain and improve CI/CD pipelines for DBT projects
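For candidates gauging fit, the DBT-on-Snowflake work described above typically means modular SQL models like the sketch below. This is purely illustrative: the source, table, and column names are hypothetical, not taken from the role.

```sql
-- models/staging/stg_orders.sql (hypothetical model name)
-- Incremental dbt model: on repeat runs, process only newly loaded rows.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    loaded_at
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- {{ this }} refers to the already-built target table in Snowflake
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```

Testing and documentation of such models (also listed in the requirements) would live alongside it in a `schema.yml` file using dbt's built-in `unique` and `not_null` tests.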
Required Skills & Experience
• 5+ years' experience in data engineering or a related role
• Proven hands-on experience with DBT (modular SQL development, testing, documentation)
• Strong expertise in Snowflake (data warehousing, performance tuning, security)
• Advanced knowledge of PL/SQL, including stored procedures, functions, and packages
• Solid understanding of data modelling concepts (e.g. star/snowflake schemas, normalisation)
• Experience with version control tools (e.g. Git) and CI/CD best practices
• Familiarity with orchestration tools such as Airflow, dbt Cloud, or Prefect is a plus
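As a rough illustration of the "stored procedures, functions, and packages" level of PL/SQL expected, a procedure of this shape would be representative; all object and column names here are hypothetical.

```sql
-- Hypothetical PL/SQL procedure: idempotent upsert of daily order totals.
CREATE OR REPLACE PROCEDURE upsert_daily_totals (p_run_date IN DATE) AS
BEGIN
  MERGE INTO daily_totals t
  USING (
    SELECT customer_id, SUM(order_total) AS total
    FROM   orders
    WHERE  TRUNC(order_date) = TRUNC(p_run_date)
    GROUP  BY customer_id
  ) s
  ON (t.customer_id = s.customer_id AND t.run_date = TRUNC(p_run_date))
  WHEN MATCHED THEN
    UPDATE SET t.total = s.total
  WHEN NOT MATCHED THEN
    INSERT (customer_id, run_date, total)
    VALUES (s.customer_id, TRUNC(p_run_date), s.total);
END upsert_daily_totals;
/
```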