Toucanberry Tech

Senior Data Engineer / Data Lead

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer/Data Lead, offering a long-term, fully remote contract at £600 - £650/day. Requires 4+ years in data engineering, strong SQL, experience with BigQuery, and financial markets expertise.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
650
🗓️ - Date
January 10, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#dbt (data build tool) #Data Visualisation #Data Warehouse #Cloud #Data Mapping #Python #Storage #Data Pipeline #Data Quality #BI (Business Intelligence) #Data Engineering #Data Processing #Snowflake #BigQuery #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Automation #Data Architecture #GCP (Google Cloud Platform) #Looker #Scala #Power Automate
Role description
Deliver robust, scalable backend systems and data infrastructure that transform manual financial and actuarial processes into automated, auditable workflows, powering data pipelines and dashboards for our global life reinsurance client in Bermuda.

Hands-on experience:
• Designing and implementing data warehouse schemas in cloud environments (BigQuery preferred)
• Building reconciliation and data quality frameworks for financial data (see the sketch at the end of this listing)
• Creating dashboards and data visualisations for business stakeholders
• Working with financial instrument identifiers (ISIN, CUSIP), rating agencies, and cross-system data mapping

Requirements

Must-have experience:
• 4+ years' experience in data engineering or data architecture roles
• Strong SQL skills with experience designing data warehouse schemas (BigQuery, Snowflake, or similar)
• Proven experience with dbt for data transformation workflows
• Asset management or financial markets experience (e.g. investment accounting, fund administration, portfolio management)
• Experience building reconciliation logic and data quality frameworks
• Proficiency with BI tools (Looker, Looker Studio, or similar)

Bonus (nice to have):
• Insurance or reinsurance industry experience
• Knowledge of Clearwater, Aladdin, and other finance systems
• Familiarity with regulatory reporting (IFRS 17, BSCR, Solvency II)
• Python proficiency for data processing and automation
• Experience with GCP services (Cloud Storage, Pub/Sub, Cloud Run)

Required behaviours/characteristics:
• Curious: proactively explores problems and technologies; asks good questions
• Hard-working: delivers reliably and takes ownership of outcomes
• Adaptable: comfortable with ambiguity and evolving requirements; learns new domains quickly
• Collaborative: works effectively with data engineers, analysts, and client stakeholders
• Quality-focused: writes clean, tested, maintainable code

Benefits
• Fully remote
• Outside IR35
• Long-term contract
• £600 - £650/day
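For illustration only, a minimal sketch of the kind of ISIN-keyed reconciliation and data quality check described above. The table and column names, the tolerance, and the use of pandas are assumptions made for the example, not details of the client's stack.

```python
# Illustrative sketch of a position-level reconciliation between two systems,
# keyed on ISIN. All names and the tolerance below are assumptions for the example.
import pandas as pd

TOLERANCE = 0.01  # relative market-value difference allowed before a break is flagged


def reconcile(custodian: pd.DataFrame, accounting: pd.DataFrame) -> pd.DataFrame:
    """Return one row per ISIN with a reconciliation status."""
    merged = custodian.merge(
        accounting, on="isin", how="outer",
        suffixes=("_custodian", "_accounting"), indicator=True,
    )
    # Positions present in only one system are breaks by definition.
    merged["status"] = "matched"
    merged.loc[merged["_merge"] == "left_only", "status"] = "missing_in_accounting"
    merged.loc[merged["_merge"] == "right_only", "status"] = "missing_in_custodian"
    # For positions in both systems, compare market values within a relative tolerance.
    rel_diff = (
        (merged["market_value_custodian"] - merged["market_value_accounting"]).abs()
        / merged["market_value_accounting"].abs()
    )
    merged.loc[(merged["status"] == "matched") & (rel_diff > TOLERANCE), "status"] = "value_break"
    return merged.drop(columns="_merge")


if __name__ == "__main__":
    custodian = pd.DataFrame(
        {"isin": ["US0378331005", "GB0002634946"], "market_value": [1_000_000.0, 250_000.0]}
    )
    accounting = pd.DataFrame(
        {"isin": ["US0378331005", "DE0007164600"], "market_value": [1_000_500.0, 90_000.0]}
    )
    print(reconcile(custodian, accounting)[["isin", "status"]])
```

In practice the same checks would typically live in the warehouse itself, for example as dbt tests over BigQuery tables, with the results surfaced to stakeholders in Looker.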