Whitehall Resources

Data Engineer - SC Cleared

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an SC Cleared Data Engineer on a contract of unspecified duration and day rate, falling inside IR35. It requires strong skills in Unix shell scripting, PL/SQL, Oracle SQL, and ETL concepts, with a focus on legacy data migration to a Data Lakehouse.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 26, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
Telford, England, United Kingdom
-
🧠 - Skills detailed
#Informatica #Data Engineering #Metadata #Migration #CLI (Command-Line Interface) #Documentation #Business Objects #Data Warehouse #Unix #Scripting #Data Lake #SQL (Structured Query Language) #Data Lakehouse #ETL (Extract, Transform, Load) #Oracle #Shell Scripting #BO (Business Objects)
Role description
Whitehall Resources currently require an SC Cleared Data Engineer to work with a key client.

Please note this role requires 2 days onsite weekly in Telford and falls INSIDE IR35.

Active SC Clearance Essential

Role Overview:
An experienced Data Engineer is required to support the team in delivering Informatica Advanced Scans across legacy data warehouses. This role is a key enabler for the strategic migration from legacy data warehouses to the new Data Lakehouse architecture. The successful candidate will work closely with stakeholders to establish a comprehensive and accurate as-is view of each legacy warehouse, leveraging Informatica Advanced Scanning capabilities wherever possible.

Key Responsibilities:
• Perform detailed analysis of legacy data warehouses using Informatica Advanced Scanning techniques.
• Produce a complete and accurate as-is representation of each data warehouse.
• Identify and document all inbound data feeds consumed by CDW, including:
  - Source/origin systems
  - Data formats
  - Full feed-by-feed data content
• Capture and document the actual metadata held within the data warehouse, including:
  - Table structures
  - Field names and definitions
• Identify and document all outbound data feeds from the data warehouse, including:
  - Consuming systems or platforms
  - Full feed-by-feed data content
  - Outbound metadata (tables, fields, and structures)
• Where possible, generate bespoke outputs from Informatica Advanced Scanning for each legacy warehouse to support analysis and migration planning.
• Collaborate with architecture, data, and migration teams to ensure outputs are aligned with Data Lakehouse migration objectives.
• Ensure documentation is clear, accurate, and suitable for use in downstream migration and governance activities.
Required Skills and Experience:
• Strong experience with Unix shell scripting
• PL/SQL and Oracle SQL
• Maestro / TWS scheduling tools
• ETL concepts and implementations
• Command-line tools (PuTTY / CLI)
• File transfer tools (e.g. WinSCP)
• Use of text and documentation tools (e.g. Notepad++, Word or equivalent)

Desirable Knowledge and Experience:
• SA Profiles extracts
• SA MARTs
• SA Warehouse structures and concepts
• Caseflow and KBS
• Authorete
• Eureka

Business Objects tooling:
• BO Infoview
• BO Developer
• BO CMC
• BO Import Wizard

Additional Information:
• This role is hands-on and delivery-focused, supporting a critical phase of legacy-to-Lakehouse migration.
• Strong attention to detail, documentation quality, and stakeholder collaboration are essential.