Insight Global

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract-to-hire basis, based in London; the pay rate is not disclosed. Key skills include Azure, SQL, Python, dbt, and Azure DevOps. Financial data experience is preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 1, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Azure #Datasets #Snowflake #ETL (Extract, Transform, Load) #Scala #Azure cloud #Airflow #DevOps #Automation #Azure DevOps #dbt (data build tool) #Data Engineering #Infrastructure as Code (IaC) #Data Pipeline #Cloud #GitHub #Data Quality #Terraform #Python #SQL (Structured Query Language) #Compliance
Role description
Insight Global's client is looking for a Senior Data Engineer to join their Finance and Operations team, responsible for designing and maintaining Azure-based data pipelines and APIs, building and optimizing ETL processes, managing large datasets, troubleshooting data issues, and documenting technical solutions. The ideal candidate will have strong coding skills in Python and SQL; experience with dbt, Azure DevOps, and CI/CD best practices; and a solid understanding of data warehousing principles. Success in this role requires excellent communication, a collaborative mindset, and proactive problem-solving to mitigate blockers and deliver scalable solutions. Candidates with experience in tools like Snowflake, Airflow, or Terraform, familiarity with infrastructure as code, and exposure to financial and operational data domains will stand out. This is a full-time onsite position in the client's London office, working closely with global teams to ensure data quality, automation, and continuous improvement. Please note, this is a 6-month contract-to-hire position and requires you to be on-site 5 days a week out of the London office.

Day to Day:
• Develop and maintain Azure-based data pipelines for Finance and Operations.
• Build and optimize ETL workflows using SQL and dbt.
• Write Python scripts for data transformation and automation.
• Deploy infrastructure as code and manage cloud data solutions.
• Collaborate with project managers and contractors across global teams.
• Ensure data quality and compliance with best practices.
• Troubleshoot and resolve data-related issues promptly.
• Document technical solutions and maintain test scripts.

Must Haves:
• Strong experience in data engineering
• Expertise in Azure (cloud platform)
• SQL (advanced ETL and query optimization)
• dbt (data transformation pipelines)
• Python (data transformation and automation scripts)
• Azure DevOps / GitHub (CI/CD pipelines, source control)
• Data warehousing and ETL best practices

Pluses:
• Experience with similar tools or technologies (e.g., Snowflake, Airflow, Terraform)
• Familiarity with infrastructure as code
• Ability to participate in architectural decisions
• Strong problem-solving and continuous improvement mindset