

Eden Scott
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 12-month contract, offering a day rate outside IR35. Located in Perthshire with a hybrid work model, candidates should have strong SQL, Python, and Power BI skills, plus experience in enterprise data platforms.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 29, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Perth, Scotland, United Kingdom
-
🧠 - Skills detailed
#Matillion #Observability #Oracle #SQL (Structured Query Language) #BI (Business Intelligence) #Microsoft Power BI #Cloud #Deployment #Data Engineering #Data Quality #Python #Airflow #Security #Leadership #Automation #dbt (data build tool) #Snowflake #ETL (Extract, Transform, Load) #Datasets #Semantic Models #GDPR (General Data Protection Regulation) #DAX #SaaS (Software as a Service) #ADF (Azure Data Factory)
Role description
Data Platform Engineer (12-Month Day Rate Contract, OUTSIDE IR35)
Perthshire | Hybrid (1–2 days onsite per week)
Excellent opportunity for an innovative and highly capable Data Platform Engineer to support and accelerate a major transformation in the way the organisation uses and trusts data.
This is a rare opportunity to build an enterprise‑grade data platform from the ground up, spanning Snowflake, Oracle SQL, Power BI and a mix of on‑prem and cloud services, while deeply embedded in the heart of a unique and multifaceted organisation.
This role blends technical engineering excellence with strategic influence. You will work across all levels of the business, including Ex‑Co, building strong relationships to truly understand our operations, commercial priorities, and strategic objectives.
Your work will directly help the organisation achieve its KPIs by ensuring data is reliable, trusted, and actionable.
The role:
Building the platform and the foundations for trust
• Play a lead role in architecting and building our new enterprise data platform
• Build strong partnerships across the organisation to deeply understand operations, goals, and data requirements
• Communicate confidently across all levels, including Executive leadership, to ensure data is trusted, reliable, and aligned with business needs
Daily ownership & operational responsibilities
• Conduct overnight data checks and review script/data logs (a minimal example of such a check is sketched after this list)
• Manage incoming project requests, reports, automations, communications, and data‑related queries
• Publish curated, business‑ready datasets—tailored to leadership needs, not raw or unrefined
• Maintain stable, observable, and cost‑efficient pipelines with clear SLAs
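To give a flavour of the overnight checks described above, here is a minimal sketch in Python. It assumes the snowflake-connector-python package; the table name, the LOADED_AT column, the staleness threshold, and the credential environment variables are all placeholders for illustration, not details taken from the role.

```python
"""Minimal overnight data-check sketch (illustrative only)."""
import os
import sys
import logging

import snowflake.connector

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("overnight_checks")

TABLE = "ANALYTICS.CURATED.SALES_DAILY"   # hypothetical curated dataset
MAX_STALENESS_HOURS = 24                  # placeholder SLA threshold


def main() -> int:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "REPORTING_WH"),
    )
    try:
        cur = conn.cursor()

        # Row-count check: an empty curated table overnight usually means a failed load.
        cur.execute(f"SELECT COUNT(*) FROM {TABLE}")
        row_count = cur.fetchone()[0]

        # Freshness check: hours since the most recent load timestamp.
        cur.execute(
            f"SELECT DATEDIFF('hour', MAX(LOADED_AT), CURRENT_TIMESTAMP()) FROM {TABLE}"
        )
        staleness_hours = cur.fetchone()[0]

        failures = []
        if row_count == 0:
            failures.append(f"{TABLE} is empty")
        if staleness_hours is None or staleness_hours > MAX_STALENESS_HOURS:
            failures.append(f"{TABLE} last loaded {staleness_hours}h ago")

        if failures:
            # In practice this would raise an alert or page the on-call engineer.
            for msg in failures:
                log.error(msg)
            return 1

        log.info("%s passed checks: %s rows, %sh since last load", TABLE, row_count, staleness_hours)
        return 0
    finally:
        conn.close()


if __name__ == "__main__":
    sys.exit(main())
```

A script like this would typically be scheduled by the orchestrator (Airflow, ADF or similar) and surfaced through the alerting and SLA dashboards mentioned below.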
Engineering excellence
• Design and develop robust ELT/ETL pipelines in Snowflake and Oracle using SQL, Python, and modern orchestration tools (see the sketch after this list)
• Deliver well‑modelled data products and certified Power BI assets
• Implement strong security, privacy, access controls and GDPR‑aligned governance
• Manage data quality, lineage, cataloguing, validation rules, retention and archiving
• Develop and maintain run‑books, dashboards, alerting and CI/CD workflows
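To illustrate the kind of ELT work listed above, here is a minimal incremental extract-and-load sketch, again in Python, assuming the oracledb and snowflake-connector-python packages; the source query, watermark column, staging table, and credentials are hypothetical, and downstream transformation (for example in dbt) is out of scope.

```python
"""Minimal incremental extract-and-load sketch (illustrative only)."""
import os
from datetime import datetime, timedelta

import oracledb
import snowflake.connector

# Hypothetical source query with an incremental watermark column.
SOURCE_SQL = """
    SELECT order_id, customer_id, order_total, updated_at
    FROM sales.orders
    WHERE updated_at > :since
"""


def run_incremental_load(since: datetime) -> int:
    # Extract: pull rows changed since the watermark from the Oracle source.
    ora = oracledb.connect(
        user=os.environ["ORACLE_USER"],
        password=os.environ["ORACLE_PASSWORD"],
        dsn=os.environ["ORACLE_DSN"],
    )
    with ora.cursor() as cur:
        cur.execute(SOURCE_SQL, since=since)
        rows = cur.fetchall()
    ora.close()

    if not rows:
        return 0

    # Load: append into a Snowflake staging table; downstream models would
    # then transform staging data into curated, business-ready datasets.
    sf = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "LOAD_WH"),
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = sf.cursor()
        cur.executemany(
            "INSERT INTO stg_orders (order_id, customer_id, order_total, updated_at) "
            "VALUES (%s, %s, %s, %s)",
            rows,
        )
    finally:
        sf.close()
    return len(rows)


if __name__ == "__main__":
    loaded = run_incremental_load(since=datetime.now() - timedelta(days=1))
    print(f"Loaded {loaded} changed rows")
```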
Cross‑functional collaboration & supplier management
• Partner with analysts, super‑users and business leaders to convert data into meaningful decisions
• Coach and support teams on best‑practice analytics and optimisation
• Manage key suppliers and vendors involved in data and technology delivery
• Standardise integration patterns across systems, ensuring repeatable, reliable processes
Your experience
You will be someone who combines deep technical expertise with a solutions‑focused, relationship‑driven mindset.
You will have:
• Strong SQL (Snowflake + Oracle), Python and experience with dbt/Airflow/ADF/Matillion or similar
• Power BI expertise: semantic models, DAX optimisation, workspace governance, deployment pipelines
• Experience running production‑grade data platforms with high availability and observability
• Strong understanding of RBAC, masking, encryption, auditability and GDPR evidence requirements
• Familiarity with hybrid/on‑prem/cloud estates and SaaS integrations
• A positive, collaborative attitude and pride in delivering exceptional work
• Excellent communication skills and confidence interacting up to Ex‑Co level
• The ability to propose and drive solutions—not just surface technical problems






