

Data Engineer - ETRM
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience in the ETRM domain. Contract length and day rate are unspecified, and the work location is remote. Key skills include Azure Data Factory, Snowflake, Python, and Fivetran.
Country: United Kingdom
Currency: £ GBP
Day rate: Unknown
Date discovered: July 1, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: London Area, United Kingdom
Skills detailed: #Data Quality #ADF (Azure Data Factory) #Data Engineering #Data Processing #Data Pipeline #Data Accuracy #Data Lake #Data Storage #Data Replication #Databricks #Snowflake #Data Integration #Computer Science #Data Architecture #Fivetran #Azure Data Factory #Python #Storage #ETL (Extract, Transform, Load) #Scala #Azure #Replication
Role description
Job Title: Senior Data Engineer
Experience: 10+ years
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines and ETRM systems.
• Implement real-time data replication from source systems to Snowflake using Fivetran.
• Work on data integration projects within the Energy Trading and Risk Management (ETRM) domain.
• Collaborate with traders, analysts, and cross-functional IT teams to understand data requirements and deliver robust solutions.
• Integrate data from ETRM trading systems such as Allegro, RightAngle, and Endur.
• Optimize and manage data storage solutions in the Data Lake and Snowflake.
• Develop and maintain ETL processes and data pipelines using Azure Data Factory (ADF), Databricks, Fivetran, and Snowflake.
• Write efficient and maintainable Python code for data processing and analysis (a minimal sketch follows this list).
• Ensure data quality, accuracy, integrity, and availability across source systems and platforms.
• Optimize and enhance the data architecture for performance and scalability.
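As a hedged illustration of the Python and Snowflake duties above (not part of the original posting), the sketch below loads a processed pandas DataFrame into Snowflake using the snowflake-connector-python package. The account, warehouse, database, schema, and table names are assumptions, not values from the post.

```python
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def load_trades(df: pd.DataFrame) -> int:
    """Append a processed trades DataFrame to a Snowflake staging table."""
    # Credentials come from the environment; every object name below is an
    # illustrative assumption, not a value from the job post.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ETRM",
        schema="STAGING",
    )
    try:
        # write_pandas bulk-loads the frame through an internal stage,
        # which is far faster than row-by-row INSERTs.
        success, _, nrows, _ = write_pandas(
            conn, df, table_name="TRADES", auto_create_table=True
        )
        return nrows if success else 0
    finally:
        conn.close()
```

In practice a Fivetran connector would handle the continuous replication described above; a hand-written load step like this is more typical for ad-hoc or derived datasets.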
Mandatory Skills:
• Azure Data Factory (ADF)
• Data Lake
• Snowflake
• Python
• Databricks (see the PySpark sketch after this list)
• Fivetran
• Experience in the ETRM domain
• Integration with trading systems such as Allegro, RightAngle, and Endur
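For the Databricks and Data Lake items, here is an equally hedged PySpark sketch of a simple curation job of the kind the role implies. The paths and column names (raw trades with trade_date, commodity, volume, price) are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etrm-daily-positions").getOrCreate()

# Assumed Data Lake layout; both paths are illustrative.
raw = spark.read.parquet("/mnt/datalake/raw/trades/")

# Derive a notional value per trade, then aggregate to daily positions.
daily_positions = (
    raw.withColumn("notional", F.col("volume") * F.col("price"))
       .groupBy("trade_date", "commodity")
       .agg(
           F.sum("notional").alias("total_notional"),
           F.count("*").alias("trade_count"),
       )
)

daily_positions.write.mode("overwrite").parquet(
    "/mnt/datalake/curated/daily_positions/"
)
```

In a production setup a job like this would typically run on Databricks and be triggered from an ADF pipeline rather than executed as a standalone script.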
Preferred Qualifications:
• Strong problem-solving skills and attention to detail.
• Excellent communication and collaboration skills.
• Ability to work in a fast-paced and dynamic environment.
• Experience with other data integration tools and technologies is a plus.
Education:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• Certification in relevant technologies is a plus.