

iXceed Solutions
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Dublin on a 6-month contract, offering £80,000-£85,000 per annum (the listed day rate equivalent is £386). Requires 5+ years in data engineering, strong expertise in Snowflake, Hadoop, Spark, Python, and SQL, and experience with CI/CD and data testing methodologies.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
386
-
🗓️ - Date
November 5, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
Edinburgh
-
🧠 - Skills detailed
#Data Governance #NoSQL #Automation #Databricks #Scala #Snowflake #Big Data #SQL (Structured Query Language) #Agile #Automated Testing #Data Architecture #Datasets #Kafka (Apache Kafka) #Data Management #ETL (Extract, Transform, Load) #Regression #Python #Data Transformations #Cloud #Data Science #Data Quality #dbt (data build tool) #Spark (Apache Spark) #Data Pipeline #NiFi (Apache NiFi) #Data Profiling #Quality Assurance #Anomaly Detection #Metadata #Hadoop #Data Engineering #Databases
Role description
Role: Senior Data Engineer
Location: Dublin
Mode: Hybrid
Type: Contract
Job Description:
Role Responsibilities
Data Engineering & Platform Development
Design, build, and maintain scalable data pipelines using Snowflake, Hadoop, Spark, NiFi, and related big data technologies.
Implement data architectures and optimize workflows for massive financial datasets.
Write high-quality, maintainable code in Python and SQL following best practices.
Integrate data governance principles, metadata management, and lineage tracking into solutions.
Data Quality Assurance & Testing
Develop automated testing frameworks and validation scripts for ETL processes and data transformations.
Implement data quality checks, reconciliation processes, and regression testing suites to ensure accuracy, completeness, and timeliness.
Perform unit, integration, and end-to-end testing for data pipelines and schema changes.
Use tools such as dbt tests and custom Python utilities for automated validation.
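As a rough illustration of the kind of automated validation described above (not part of the posting itself), the sketch below shows a custom Python utility performing a source-to-target reconciliation and a not-null data quality check; the table data, column names, and thresholds are hypothetical:

```python
# Hypothetical data validation utilities of the kind a custom Python
# test framework for ETL pipelines might contain.

def reconcile(source_rows, target_rows, key="id"):
    """Reconciliation check: compare row counts and find keys that were
    dropped between a source and target table (rows given as dicts)."""
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    missing = sorted(source_keys - target_keys)
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_keys": missing,
        "complete": not missing,
    }

def check_not_null(rows, column):
    """Data quality check: return rows where a required column is null."""
    return [row for row in rows if row.get(column) is None]

if __name__ == "__main__":
    # Hypothetical extract results: one row lost in transit, one null amount.
    source = [{"id": 1, "amount": 10.0},
              {"id": 2, "amount": None},
              {"id": 3, "amount": 5.5}]
    target = [{"id": 1, "amount": 10.0},
              {"id": 3, "amount": 5.5}]

    report = reconcile(source, target)
    print(report["missing_keys"])          # keys present in source but not target
    print(len(check_not_null(source, "amount")))  # count of null amounts
```

Checks like these are typically wired into a CI/CD pipeline or scheduled alongside the pipeline run, failing the build or raising an alert when reconciliation or quality thresholds are breached.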
Collaboration & Agile Delivery
Work closely with Data Engineers, Product, and Data Science teams to embed testing into the development lifecycle.
Participate in agile ceremonies (sprint planning, backlog refinement, retrospectives) with a focus on quality and delivery.
Support production incident response with rapid data validation and root cause analysis.
Continuous Improvement
Stay current with emerging data engineering and testing technologies.
Contribute to team knowledge sharing, mentoring junior engineers, and improving technical standards.
Shape best practices for data reliability, testing automation, and CI/CD integration.
Skills & Qualifications
Core Technical Expertise
Advanced SQL and experience with relational and NoSQL databases.
Strong experience with Snowflake, Hadoop, Spark, Databricks, Kafka, and cloud data platforms.
Proficiency in Python for both data engineering and test automation.
Familiarity with orchestration tools and workflow management systems.
Testing & Quality
Proven experience in data testing methodologies, ETL validation, and automated testing frameworks.
Knowledge of data profiling, anomaly detection, and statistical validation techniques.
Experience integrating testing into CI/CD pipelines.
Professional Attributes
Strong problem-solving and analytical skills with attention to detail.
Excellent communication skills for cross-functional collaboration.
Ability to work independently and manage multiple priorities in fast-paced environments.
Job Type: Fixed term contract
Contract length: 6 months
Pay: £80,000.00-£85,000.00 per annum
Experience:
Data Engineering: 5 years (required)
Snowflake: 3 years (required)
Hadoop: 3 years (required)
Spark: 3 years (required)
NiFi: 2 years (required)
Python: 2 years (required)
SQL: 2 years (required)
Databricks: 3 years (required)
Kafka: 1 year (required)
Data testing methodologies: 3 years (required)
CI/CD pipelines: 2 years (required)





