

Insight International (UK) Ltd
Data Engineer (Python, Databricks, Snowflake, ETL)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract in Glasgow, UK, requiring 4+ years in Python and data pipelines, 3+ years with Databricks and Snowflake, and strong ETL and data integration skills. On-site work is required three days a week.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 5, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glasgow, Scotland, United Kingdom
-
🧠 - Skills detailed
#Data Integration #Apache Airflow #Snowflake #Databricks #Libraries #Scala #Big Data #BI (Business Intelligence) #NumPy #Agile #Pandas #REST (Representational State Transfer) #Airflow #ETL (Extract, Transform, Load) #Microsoft Power BI #PySpark #Python #Data Orchestration #Linux #REST API #GIT #Spark (Apache Spark) #Data Pipeline #Visualization #Database Administration #Hadoop #Data Engineering #Cloud #Data Processing
Role description
Role: Data Engineer (Python, Databricks, Snowflake, ETL)
Location: Glasgow, UK (3 days/week On-Site)
Job Type: Contract
Skills / Qualifications:
• 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, and PySpark
• 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
• 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions
• 3+ years of experience in data development and solutions in highly complex data environments with large data volumes.
• Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices
• Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment
• Experience with code versioning tools (e.g., Git)
• Knowledge of Linux operating systems
• Familiarity with REST APIs and integration techniques
• Familiarity with data visualization tools and libraries (e.g., Power BI)
• Background in database administration or performance tuning
• Familiarity with data orchestration tools, such as Apache Airflow
• Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
• Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
• Strong communication skills, both verbal and written. Capable of collaborating effectively with a variety of IT and business groups across regions and roles, and of interacting effectively at all levels.
• Self-starter with a proven ability to manage multiple concurrent projects with minimal supervision. Can manage a complex, ever-changing priority list and resolve conflicts between competing priorities.
• Strong problem-solving skills. Ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.






