

Stott and May
Azure Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Engineer in Warwick, UK, for 6 months at a market rate (Inside IR35). Key skills include Azure Data Factory, Databricks, Snowflake, Python, and SQL. Six to eight years of relevant data engineering experience is desirable.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 14, 2026
🕒 - Duration
6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Warwick, England, United Kingdom
-
🧠 - Skills detailed
#Data Integration #Deployment #ETL (Extract, Transform, Load) #Agile #Security #Data Quality #Databases #Documentation #Datasets #Azure #Data Architecture #Data Engineering #Cloud #ADF (Azure Data Factory) #Azure Data Factory #SQL (Structured Query Language) #Data Storage #Debugging #Storage #Data Processing #Python #Data Pipeline #Snowflake #Monitoring #Databricks #Scala
Role description
Job Description
Data Engineer
Location: Warwick, UK (office based)
Day Rate: Market rate (Inside IR35)
Duration: 6 months
Role description
This role is for an experienced Azure Data Engineer to support the design, development and optimisation of data solutions within a modern cloud-based environment. The successful candidate will contribute to building scalable and reliable data pipelines, enabling data-driven decision-making across the organisation.
Working within a fast-paced delivery environment, the role offers exposure to cloud technologies and advanced data platforms, supporting the transformation and integration of large and complex datasets. The position plays a key role in ensuring data quality, accessibility and performance across business-critical systems.
Key Responsibilities
• Design, build and maintain scalable data pipelines using Azure data services
• Develop and manage data integration workflows using Azure Data Factory
• Perform data transformation and processing using tools such as Databricks and Python
• Troubleshoot and debug data pipelines to ensure reliability and performance
• Collaborate with stakeholders to understand data requirements and translate them into technical solutions
• Optimise data storage and processing using platforms such as Snowflake
• Ensure data quality, governance and security standards are maintained
• Support deployment and monitoring of data solutions in production environments
• Document data processes and maintain technical documentation
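The transformation and data-quality responsibilities above can be sketched with a minimal, self-contained Python example. This is illustrative only: the column names, validation rule, and rounding are hypothetical, not taken from the role (in practice this logic would typically live in a Databricks notebook or an Azure Data Factory pipeline activity).

```python
import csv
import io

def transform(raw_csv: str) -> list[dict]:
    """Parse raw CSV, drop rows failing a simple quality gate, normalise types.

    Hypothetical schema: columns 'id' and 'amount'.
    """
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Quality gate: skip rows with a missing or non-numeric amount
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue
        # Normalise: trim whitespace from ids, round amounts to 2 d.p.
        rows.append({"id": row["id"].strip(), "amount": round(amount, 2)})
    return rows

raw = "id,amount\na1,10.456\na2,not-a-number\na3,3.1\n"
print(transform(raw))  # the row with a non-numeric amount is dropped
```

The same drop-or-normalise pattern scales up in Databricks by expressing the filter and cast as DataFrame operations rather than a row loop.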
Key Skills, Knowledge And Experience
• Strong experience with Azure data services, including Azure Data Factory
• Advanced expertise in Azure Data Factory development and optimisation
• Experience working with Snowflake for data warehousing solutions
• Hands-on experience with Databricks for large-scale data processing
• Proficiency in Python for data engineering tasks
• Strong SQL knowledge and experience working with relational databases
• Experience in debugging and resolving data pipeline issues
• Understanding of data architecture and best practices in data engineering
• Good communication skills with the ability to work collaboratively
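As a rough illustration of the relational SQL skills listed above, the sketch below runs a grouped aggregation against an in-memory SQLite database. The table and data are hypothetical; the role's warehousing work would run against Snowflake, where the same SQL pattern applies.

```python
import sqlite3

# Hypothetical orders table, used only to demonstrate a GROUP BY aggregation
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("o1", "acme", 120.0), ("o2", "acme", 30.0), ("o3", "beta", 55.0)],
)

# Total spend per customer, ordered for a deterministic result
total_by_customer = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(total_by_customer)  # [('acme', 150.0), ('beta', 55.0)]
```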
Desirable Skills, Knowledge And Experience
• 6 to 8 years of relevant data engineering experience
• Exposure to additional cloud data platforms or modern data stack tools
• Experience working in agile delivery environments






