

Assala Energy
Petroleum Data Scientist
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Petroleum Data Scientist on a contract basis, focusing on data engineering, predictive analytics, and BI development. Requires a Master's in Computer Science, expertise in ETL, Python, SQL, and Power BI. Contract length and pay rate are unspecified.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
February 4, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#Git #Databases #Dimensional Modelling #Compliance #Pandas #Data Quality #Python #AWS (Amazon Web Services) #Snowflake #Databricks #Quality Assurance #Monitoring #Data Management #Data Engineering #BI (Business Intelligence) #Data Enrichment #Datasets #Consulting #SQL (Structured Query Language) #ML (Machine Learning) #Storage #ETL (Extract, Transform, Load) #Azure #Microsoft Power BI #NumPy #Cloud #Computer Science #Data Science #Data Warehouse #Data Architecture
Role description
Assala Energy is a dynamic Oil and Gas Exploration and Production company committed to the sustainable development of its assets in Gabon. We value a collaborative approach, promote diversity, and prioritize safety and integrity in all our operations.
To support the creation and development of data‑driven initiatives and facilitate the transition of business processes toward a "Data‑Driven" methodology, the consultant will provide predictive analytics, data engineering, BI development, and data quality assurance.
This is a contract opportunity offered subject to IR35 compliance checks.
Service Overview
The consultant will:
• Deliver end‑to‑end data management systems for BI projects
• Develop a global understanding of business processes to identify, support, and prioritise BI initiatives
• Provide technical expertise to our Operations, Finance, Logistics, and other departments
• Facilitate the organisation's transition toward a data‑driven culture and methodology
• Collaborate with the Data Engineer in developing architecture, processes, and data workflows
Scope of Work & Deliverables
Main Deliverables
• Propose decision‑support analysis solutions in collaboration with business teams
• Ensure QA/QC of all datasets used for BI, analytics and predictive modelling developments
• Identify and develop predictive systems to support the industrialisation, production, storage and maintenance of ML models
• Develop data enrichment pipelines and consolidate databases to ensure reliable data sources for reporting (an illustrative sketch follows this list)
• Collaborate with the Data Engineer to design and build the data platform infrastructure, data architecture and ETL processes
• Design and manage the company's Data Warehouse
• Create, maintain and develop reports, dashboards, KPIs, monitoring and visualisation tools, mainly using Power BI
• Deliver BI tools first for Operations and then extend support to other departments upon request and approval, including:
• Integration of new asset data into existing reports
• Finance / Cost Control KPI Dashboard
• Daily & Weekly reporting standardisation and improvement
• Logistics KPI Dashboard (POB, passenger flows, goods movement)
Full scope of work with deliverables and objectives will be made available during a technical validation meeting.
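For illustration only, the sketch below shows the kind of data enrichment, QA/QC, and simple predictive-modelling step the deliverables above describe. It is not part of the role specification: every file name, column, threshold, and model choice is a hypothetical assumption, and the real pipelines would target Assala's own sources and cloud environment.

Illustrative sketch (Python, pandas and scikit-learn):

import pandas as pd
from sklearn.ensemble import RandomForestRegressor

def load_and_qc(path: str) -> pd.DataFrame:
    # Load a raw daily-production extract and apply basic quality checks.
    # The schema (well_id, date, oil_rate_bopd, ...) is hypothetical.
    df = pd.read_csv(path, parse_dates=["date"])
    df = df.drop_duplicates(subset=["well_id", "date"])            # remove duplicate records
    plausible = df["oil_rate_bopd"].isna() | df["oil_rate_bopd"].between(0, 50_000)
    df = df[plausible].copy()                                      # reject implausible rates
    df["oil_rate_bopd"] = df["oil_rate_bopd"].interpolate()        # fill short gaps in readings
    return df

def train_rate_model(df: pd.DataFrame) -> RandomForestRegressor:
    # Fit a simple predictive model on a few enriched features.
    cols = ["choke_size", "tubing_head_pressure", "water_cut"]
    clean = df.dropna(subset=cols + ["oil_rate_bopd"])             # scikit-learn needs complete rows
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(clean[cols], clean["oil_rate_bopd"])
    return model

data = load_and_qc("daily_production.csv")                         # hypothetical source file
model = train_rate_model(data)

In practice, checks of this kind would sit inside the ETL pipelines (Azure/AWS, Databricks or Snowflake) so that only validated datasets reach Power BI reports or ML models.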
Requirements
• Master's degree in Computer Science or equivalent field
• Experience in data engineering or data science consulting
• Strong expertise with ETL pipeline development and cloud-based environments (Azure, AWS, Databricks, Snowflake)
• Python (NumPy, pandas, scikit-learn)
• SQL, dimensional modelling (see the star-schema sketch after this list)
• Power BI
• Git, CI/CD
• VSCode or PyCharm
• Data Warehouse design & architecture
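As context for the SQL, dimensional modelling, and Data Warehouse requirements, the sketch below shows one common way to split a flat extract into a small star schema (one fact table and two dimensions). It is illustrative only and written in Python/pandas for consistency with the earlier sketch; all table and column names are hypothetical, and in the actual warehouse this shape would normally be defined in SQL.

Illustrative sketch (Python, pandas):

import pandas as pd

flat = pd.read_csv("daily_production.csv", parse_dates=["date"])   # hypothetical flat extract

# Well dimension: one row per well, with a surrogate key
dim_well = (
    flat[["well_id", "field", "asset"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_well["well_key"] = dim_well.index + 1

# Date dimension keyed on an integer yyyymmdd surrogate
dim_date = flat[["date"]].drop_duplicates().sort_values("date").reset_index(drop=True)
dim_date["date_key"] = dim_date["date"].dt.strftime("%Y%m%d").astype(int)

# Fact table: production measures keyed on the dimension surrogate keys
fact_production = (
    flat.merge(dim_well[["well_id", "well_key"]], on="well_id")
        .merge(dim_date, on="date")
        [["well_key", "date_key", "oil_rate_bopd", "water_cut"]]
)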
Profile
• Detail‑oriented, autonomous, and service‑focused
• Strong communication
• Curious, proactive, and collaborative
• Strong problem‑solving mindset and client‑focused approach