

Infoplus Technologies UK Limited
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer based in Glasgow, UK (Hybrid, 3 days onsite) for a 6-month contract (Inside IR35). Key skills include Python, Databricks, ETL processes, Snowflake, and agile methodologies.
Country
United Kingdom
-
Currency
£ GBP
-
Day rate
Unknown
-
Date
October 29, 2025
-
Duration
More than 6 months
-
Location
Hybrid
-
Contract
Inside IR35
-
Security
Unknown
-
Location detailed
Glasgow, Scotland, United Kingdom
-
Skills detailed
#Data Pipeline #Monitoring #Data Processing #Data Engineering #Code Reviews #Spark (Apache Spark) #Cloud #Automation #Snowflake #Data Orchestration #Data Integration #Database Administration #Apache Airflow #Microsoft Power BI #Python #Big Data #Scala #GIT #REST API #Linux #BI (Business Intelligence) #Programming #REST (Representational State Transfer) #Data Extraction #ETL (Extract, Transform, Load) #Agile #Databricks #Airflow #Visualization #Hadoop #Documentation #Libraries
Role description
· Job Title: Senior Data Engineer
· Location: Glasgow, UK (Hybrid, 3 days onsite)
· Duration: 6 months (Inside IR35)
Job Description:
Role Responsibilities
You will be responsible for:
• Collaborating with cross-functional teams to understand data requirements and design efficient, scalable, and reliable ETL processes using Python and Databricks.
• Developing and deploying ETL jobs that extract data from various sources and transform it to meet business needs.
• Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency.
• Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimisation.
• Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives.
• Conducting code reviews, providing constructive feedback, and enforcing coding standards to maintain high quality.
• Developing and maintaining tooling and automation scripts to streamline repetitive tasks.
• Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes.
• Utilising REST APIs and other integration techniques to connect various data sources.
• Maintaining documentation, including data flow diagrams, technical specifications, and processes.
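To give a feel for the extract-cleanse-transform-load lifecycle described above, here is a minimal sketch in plain Python. The source data, field names, and in-memory "load" target are all hypothetical; a production pipeline at this level would typically use Databricks/Spark APIs and write to Snowflake.

```python
import csv
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

# Hypothetical raw extract, standing in for an upstream source system.
RAW_CSV = """order_id,amount,currency
1001,250.00,GBP
1002,not-a-number,GBP
1003,99.50,GBP
"""

def extract(raw: str) -> list[dict]:
    """Extract: read rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cleanse and type-convert, logging bad rows instead of failing."""
    clean = []
    for row in rows:
        try:
            clean.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "currency": row["currency"],
            })
        except (KeyError, ValueError) as exc:
            log.warning("Skipping bad row %r: %s", row, exc)
    return clean

def load(rows: list[dict]) -> dict[int, dict]:
    """Load: here just an in-memory table keyed by order_id."""
    return {r["order_id"]: r for r in rows}

table = load(transform(extract(RAW_CSV)))
```

Note the transform step logs and skips the malformed row rather than aborting the whole job, which reflects the "proper error handling and monitoring" responsibility above.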
You Have:
• Proficiency in Python programming, including experience writing efficient and maintainable code.
• Hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines.
• Proficiency working with Snowflake or similar cloud-based data warehousing solutions.
• Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices.
• Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment.
• Experience with code versioning tools (e.g., Git).
• Meticulous attention to detail and a passion for problem solving.
• Knowledge of Linux operating systems.
• Familiarity with REST APIs and integration techniques.
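Pulling data from REST APIs usually also means handling transient failures. A minimal sketch of retry-with-backoff (the flaky source and its payload are hypothetical stand-ins; real code would wrap a urllib.request or requests call to the actual endpoint):

```python
import json
import time
from typing import Callable

def fetch_with_retry(fetch: Callable[[], str],
                     retries: int = 3,
                     backoff: float = 0.1) -> dict:
    """Call a fetcher with exponential backoff, then parse its JSON body.

    `fetch` would normally wrap an HTTP GET against a REST endpoint;
    here it is injected so the sketch is self-contained.
    """
    for attempt in range(retries):
        try:
            return json.loads(fetch())
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the last error
            time.sleep(backoff * 2 ** attempt)

# Hypothetical flaky source: fails once, then returns a payload.
calls = {"n": 0}
def flaky_source() -> str:
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("transient network error")
    return '{"records": [1, 2, 3]}'

payload = fetch_with_retry(flaky_source)
```

Injecting the fetcher as a callable keeps the retry logic testable without network access, a common pattern when wiring REST sources into data pipelines.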
You might also have:
• Familiarity with data visualization tools and libraries (e.g., Power BI).
• Background in database administration or performance tuning.
• Familiarity with data orchestration tools, such as Apache Airflow.
• Previous exposure to big data technologies (e.g., Hadoop, Spark) for large-scale data processing.
• Experience with ServiceNow integration.






