

EC Markets
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer; the contract length and pay rate are not specified. The position requires expertise in Snowflake, SQL, and Python, plus experience in financial services or trading environments. A degree in Computer Science or a related field and 5–8 years of relevant experience are essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Greater London, England, United Kingdom
-
🧠 - Skills detailed
#Delta Lake #Data Lifecycle #Data Architecture #Data Lake #GDPR (General Data Protection Regulation) #Computer Science #Data Lakehouse #Data Integrity #Automation #Compliance #ETL (Extract, Transform, Load) #Data Analysis #SQL (Structured Query Language) #Python #BI (Business Intelligence) #Data Engineering #Security #Databases #Storage #Dimensional Modelling #CRM (Customer Relationship Management) #NoSQL #Data Governance #Data Lineage #Scripting #Documentation #Snowflake #Scala
Role description
Overview
EC Markets is building a next-generation data platform to power trading, operational, and marketing intelligence. We are seeking a Senior Data Engineer to design and build our Snowflake-based Data Lakehouse, establishing a modern data ecosystem that supports advanced analytics, compliance, and decision-making across the business.
This is a high-impact role for an experienced engineer with a track record of architecting and implementing scalable data platforms — ideally in financial or trading environments — who can lead the build-out of our data infrastructure from the ground up.
Key Responsibilities
Architecture & Development
• Design and build a Snowflake-centric Data Lakehouse integrating structured, semi-structured, and unstructured data.
• Develop robust ETL/ELT pipelines that ingest and transform data from multiple internal systems (trading, CRM, finance, risk, etc.) and external APIs (a pipeline sketch follows this list).
• Implement data models, schemas, and transformation frameworks optimised for analytical and regulatory use cases.
• Apply best practices in data versioning, orchestration, and automation using modern data engineering tools.
• Ensure scalability, data lineage, and governance across the data lifecycle.
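For illustration, a minimal ELT sketch of the kind of pipeline described above, assuming the Snowflake Python connector (snowflake-connector-python) and the requests library. The API endpoint, credentials, warehouse, and table names are hypothetical placeholders, not EC Markets systems.
    # Minimal ELT sketch: land raw trade records from a (hypothetical) internal
    # API into a Snowflake VARIANT staging table, then derive a typed table in SQL.
    import json
    import os
    import requests
    import snowflake.connector

    def load_trades() -> None:
        rows = requests.get("https://internal.example/api/trades", timeout=30).json()
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="LOAD_WH",
            database="RAW",
            schema="TRADING",
        )
        try:
            cur = conn.cursor()
            # Land the payload as semi-structured JSON (the "EL" of ELT).
            cur.execute(
                "CREATE TABLE IF NOT EXISTS RAW_TRADES ("
                " payload VARIANT,"
                " loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP())"
            )
            # Row-by-row for clarity; a production pipeline would stage files
            # and use COPY INTO instead.
            for record in rows:
                cur.execute(
                    "INSERT INTO RAW_TRADES (payload) SELECT PARSE_JSON(%s)",
                    (json.dumps(record),),
                )
            # Transform in-warehouse into a typed, analytics-ready table.
            cur.execute(
                """
                CREATE OR REPLACE TABLE ANALYTICS.TRADING.FCT_TRADES AS
                SELECT payload:trade_id::STRING           AS trade_id,
                       payload:symbol::STRING             AS symbol,
                       payload:quantity::NUMBER           AS quantity,
                       payload:price::FLOAT               AS price,
                       payload:executed_at::TIMESTAMP_NTZ AS executed_at
                FROM RAW_TRADES
                """
            )
        finally:
            conn.close()

    if __name__ == "__main__":
        load_trades()
Keeping the untouched JSON in the raw layer means transformations can be replayed in SQL without re-ingesting from the source system.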
Data Governance & Quality
• Define and implement standards for data validation, cataloguing, and documentation.
• Maintain high data integrity, privacy, and security aligned with FCA and GDPR requirements.
• Monitor and optimise query performance and storage efficiency.
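A possible shape for such validation standards: each check is a SQL query that should return no rows when the data is healthy, run on a schedule or after each load. The table names and rules below are hypothetical examples, not EC Markets definitions.
    # Lightweight data-quality checks; a check fails if its query returns a row.
    import os
    import snowflake.connector

    CHECKS = {
        "no_null_trade_ids": (
            "SELECT 1 FROM ANALYTICS.TRADING.FCT_TRADES WHERE trade_id IS NULL LIMIT 1"
        ),
        "no_negative_quantities": (
            "SELECT 1 FROM ANALYTICS.TRADING.FCT_TRADES WHERE quantity < 0 LIMIT 1"
        ),
        "no_duplicate_trade_ids": (
            "SELECT trade_id FROM ANALYTICS.TRADING.FCT_TRADES "
            "GROUP BY trade_id HAVING COUNT(*) > 1 LIMIT 1"
        ),
    }

    def run_checks(conn) -> list[str]:
        """Run every check and return the names of the ones that failed."""
        cur = conn.cursor()
        return [name for name, sql in CHECKS.items() if cur.execute(sql).fetchone()]

    if __name__ == "__main__":
        connection = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="MONITOR_WH",
        )
        try:
            print("failed checks:", run_checks(connection) or "none")
        finally:
            connection.close()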
Cross-Functional Collaboration
• Partner with business units (Trading, Finance, Marketing, Compliance) to capture data requirements and translate them into robust technical solutions.
• Support regulatory, management, and operational reporting requirements through structured data models.
Key Objectives
• Define the core Snowflake-based architecture and data model.
• Build foundational ETL/ELT pipelines from primary systems and third-party data sources.
• Establish automated CI/CD workflows for data operations.
• Deliver the first version of the EC Markets Data Lakehouse to support BI and compliance analytics.
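One way the CI/CD objective might look in practice is a data test suite that the workflow runs against a development warehouse before promoting changes. The sketch below uses pytest, with connection details taken from environment variables and an illustrative table name.
    # A data test that an automated CI/CD workflow could run on each change
    # to the transformation code, failing the build if the check fails.
    import os
    import pytest
    import snowflake.connector

    @pytest.fixture()
    def conn():
        c = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="CI_WH",
        )
        yield c
        c.close()

    def test_no_future_dated_trades(conn):
        cur = conn.cursor()
        cur.execute(
            "SELECT COUNT(*) FROM ANALYTICS.TRADING.FCT_TRADES "
            "WHERE executed_at > CURRENT_TIMESTAMP()"
        )
        assert cur.fetchone()[0] == 0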
Skills & Experience
• Proven experience designing and delivering DWH / Delta Lakehouse using Snowflake.
• Strong SQL and data modelling expertise (star/snowflake schemas, dimensional modelling); a schema sketch follows this list.
• Experience integrating SQL/NoSQL databases and external APIs as data sources.
• Proficiency in Python or another scripting language.
• Familiarity with orchestration and transformation frameworks.
• Hands-on experience or a strong understanding of data analysis, visualisation, and operational reporting tools is highly desirable.
• Experience in financial services, trading, or fintech environments preferred.
• Excellent communication skills and ability to translate business requirements into scalable data architecture.
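To make the dimensional-modelling expectation concrete, here is a sketch of a small star schema (one fact table, two dimensions) created through the Snowflake Python connector. All names are hypothetical, and note that Snowflake accepts but does not enforce primary/foreign key constraints other than NOT NULL.
    # Star-schema sketch: trade facts keyed to instrument and date dimensions.
    import os
    import snowflake.connector

    DDL = [
        """
        CREATE TABLE IF NOT EXISTS DIM_INSTRUMENT (
            instrument_key NUMBER AUTOINCREMENT PRIMARY KEY,
            symbol         STRING NOT NULL,
            asset_class    STRING,
            currency       STRING
        )
        """,
        """
        CREATE TABLE IF NOT EXISTS DIM_DATE (
            date_key  NUMBER PRIMARY KEY,  -- e.g. 20260130
            full_date DATE NOT NULL,
            month     NUMBER,
            year      NUMBER
        )
        """,
        """
        CREATE TABLE IF NOT EXISTS FCT_TRADES (
            trade_id       STRING NOT NULL,
            instrument_key NUMBER REFERENCES DIM_INSTRUMENT (instrument_key),
            date_key       NUMBER REFERENCES DIM_DATE (date_key),
            quantity       NUMBER,
            gross_amount   NUMBER(18, 2)
        )
        """,
    ]

    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="DEV_WH",
        database="ANALYTICS",
        schema="TRADING",
    ) as conn:
        cur = conn.cursor()
        for stmt in DDL:
            cur.execute(stmt)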
Qualifications
• Degree in Computer Science, Data Engineering, or related field.
• 5–8 years of hands-on experience in data engineering or infrastructure development.