

Optimal
Senior Data Engineer | Azure | Snowflake | Fivetran | DBT | ETL / ELT | APIs
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with expertise in Azure, Snowflake, Fivetran, DBT, Python, and APIs. The contract is initially 9 months, paying £225 per day, with remote or hybrid work options. Strong experience in ETL/ELT pipelines is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
225
-
🗓️ - Date
February 11, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Python #Big Data #Data Warehouse #Data Quality #Azure #Data Ingestion #Fivetran #API (Application Programming Interface) #Airflow #Data Engineering #Scala #Data Access #ETL (Extract, Transform, Load) #Data Lake #Datasets #SQL (Structured Query Language) #BI (Business Intelligence) #Data Pipeline #Data Science #dbt (data build tool) #Databases #Snowflake
Role description
Senior Data Engineer | Azure | Snowflake | Fivetran | DBT | ETL / ELT | APIs
Rate: £225 per day (Outside IR35)
Contract: Initially 9 months - likely to extend
Start: ASAP
Working model: Remote (depending on location) or hybrid, 1 day per week on site
Working hours: UK (GMT)
Required: Azure, Fivetran (Airbyte or Airflow also considered), Snowflake, Python & APIs (non-negotiable)
🚨 Non‑Negotiables - Please only apply if you have ALL of the following 🚨
Core technologies:
• Snowflake (data warehouse)
• Fivetran (data connectors)
• DBT (data transformation)
• Python (including building custom API connectors)
Environment:
• Strong experience working in Azure‑based data environments
• Proven, hands‑on experience as a Data Engineer
• Strong experience building and maintaining ETL/ELT pipelines
• Solid understanding of data warehousing and data modelling concepts
Key skills:
• Experience with Fivetran, Snowflake, Python, and APIs
• Strong knowledge of connecting to APIs for data ingestion (see the connector sketch after this list)
• Familiarity with similar tools (Airbyte, Airflow, etc.) is acceptable if you understand the core principles of API‑based ingestion
• Strong SQL skills and experience with structured and semi‑structured data
• Experience ensuring data quality, consistency, and governance
• Ability to work UK (GMT) hours
• High‑level written and spoken English, with the ability to communicate clearly with a global team
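To make the API‑ingestion requirement concrete, here is a minimal sketch of the kind of custom connector the role describes, assuming a hypothetical paginated REST endpoint; the URL, "page" parameter, and token handling are illustrative placeholders, not any specific vendor's API.

```python
# Minimal sketch of a custom API connector in Python. The endpoint, pagination
# parameters, and auth scheme below are assumptions for illustration only.
import time
import requests

API_URL = "https://api.example.com/v1/records"   # hypothetical endpoint
API_TOKEN = "REPLACE_ME"                          # supplied via secrets management in practice


def fetch_all_records(page_size: int = 500, max_retries: int = 3) -> list[dict]:
    """Pull every page from the endpoint, retrying transient server errors."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    records: list[dict] = []
    page = 1
    while True:
        params = {"page": page, "per_page": page_size}
        for attempt in range(max_retries):
            response = requests.get(API_URL, headers=headers, params=params, timeout=30)
            if response.status_code < 500:
                break
            time.sleep(2 ** attempt)              # simple exponential backoff on 5xx
        response.raise_for_status()
        batch = response.json().get("data", [])
        if not batch:
            return records                        # empty page means no more data
        records.extend(batch)
        page += 1
```

In a Fivetran‑first stack, connectors like this would typically only cover sources without an off‑the‑shelf connector, with dbt handling downstream transformation once the data lands in Snowflake.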
Role Overview
We’re looking for an experienced senior-level Data Engineer to join a fast‑moving team responsible for building and scaling a modern, Azure‑native data platform. You’ll play a key role in designing, developing, and maintaining robust data infrastructure that supports analytics, reporting, and business decision‑making across the organisation.
This is a hands‑on role focused on data pipelines, ETL/ELT processes, data modelling, and data quality, working closely with analysts, data scientists, and software engineers in a collaborative environment.
Key Responsibilities
• Design, develop, and maintain data infrastructure including data warehouses, data lakes, databases, and ETL pipelines
• Build efficient, scalable ETL/ELT pipelines to ingest data from multiple sources, including API‑based ingestion
• Develop custom API connectors in Python where required
• Design and implement data models to support business and analytical requirements
• Define data schemas, structures, and relationships between data entities
• Optimise data models and pipelines for performance, scalability, and reliability
• Collaborate with cross‑functional teams to understand data needs and ensure seamless data access and integration
• Support system integration efforts to ensure accurate and consistent data flow between platforms
• Implement and maintain data quality controls, validation processes, and governance frameworks (see the validation sketch after this list)
• Monitor data trends and analyse datasets to identify patterns, risks, and opportunities for improvement
• Document data engineering processes, data flows, and system configurations
• Clearly articulate complex technical concepts to both technical and non‑technical stakeholders
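As a rough illustration of the data quality controls mentioned above, here is a minimal pre‑load validation sketch, assuming records arrive as a list of dicts from an ingestion step; the field names are illustrative assumptions, not part of the role's actual schema.

```python
# Minimal sketch of a pre-load data quality check. Field names ("id",
# "updated_at") are illustrative placeholders.
from collections import Counter


def validate_records(records: list[dict], key_field: str = "id",
                     required_fields: tuple[str, ...] = ("id", "updated_at")) -> list[str]:
    """Return a list of human-readable data quality issues; an empty list means clean."""
    issues: list[str] = []

    # Completeness: every required field must be present and non-null.
    for i, rec in enumerate(records):
        for field in required_fields:
            if rec.get(field) in (None, ""):
                issues.append(f"row {i}: missing value for '{field}'")

    # Uniqueness: the business key should not repeat within a batch.
    key_counts = Counter(rec.get(key_field) for rec in records)
    for key, count in key_counts.items():
        if count > 1:
            issues.append(f"duplicate {key_field} '{key}' appears {count} times")

    return issues
```

Checks like these would usually be expressed as dbt tests (unique, not_null) once the data is in Snowflake; a Python version is handy for validating raw payloads before load.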
Nice to Have
• Experience with modern orchestration tools (e.g., Airflow, Dagster)
• Experience with big data technologies or streaming platforms
• Exposure to analytics, BI, or data science workflows
• Experience working in fast‑paced or scale‑up environments
If you tick all of the above boxes and can start ASAP, we’d love to hear from you.