

Data Engineer (Teradata / Informatica)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Teradata/Informatica) for 3-9 months, paying up to £510 per day, located in West London. Key skills include Teradata, Informatica, SQL, and Python, plus experience building cloud data pipelines, ideally on AI-driven projects.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
510
🗓️ - Date discovered
June 13, 2025
🕒 - Project duration
3 to 9 months
🏝️ - Location type
On-site
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#Agile #Documentation #Data Engineering #DataOps #Airflow #AI (Artificial Intelligence) #dbt (data build tool) #AWS (Amazon Web Services) #Scala #Version Control #Data Quality #ML (Machine Learning) #Python #Snowflake #Data Pipeline #SQL (Structured Query Language) #Teradata #ETL (Extract, Transform, Load) #GraphQL #Terraform #Informatica #AWS Glue #Cloud
Role description
Data Engineer (Teradata / Informatica)
Start: ASAP
Duration: 3-9 months
Pay: inside IR35, up to £510 per day
Location: West London, Hounslow area
Join a major data transformation programme that is modernising its data estate—blending robust legacy systems with cutting-edge, cloud-first architecture. This is not a standard lift-and-shift project; it’s a long-term, forward-looking initiative focused on AI-driven decision-making and operational excellence.
Key Responsibilities
• Design and maintain scalable, cloud-based data pipelines.
• Modernise legacy ETL processes using tools such as Airflow, dbt, and AWS Glue (see the sketch after this list).
• Collaborate with architects and engineers to deliver high-quality, production-ready solutions.
• Optimise workflows and ensure data quality, reliability, and documentation.
• Champion best practices and a culture of continuous improvement.
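To make the modernisation responsibility concrete, here is a minimal, illustrative Airflow 2.x DAG skeleton of the kind such a pipeline might use. The DAG id, task names, schedule, and retry settings are hypothetical placeholders, not details taken from the role.

```python
# Minimal sketch of a daily extract-and-load DAG (Airflow 2.4+).
# All identifiers (dag_id, task names) are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_legacy(**context):
    # Stand-in for pulling a daily increment from a legacy source
    # such as Teradata; context["ds"] is the logical run date.
    print("extracting increment for", context["ds"])


def load_to_warehouse(**context):
    # Stand-in for loading the increment into a cloud warehouse.
    print("loading increment for", context["ds"])


with DAG(
    dag_id="legacy_etl_modernised",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # "schedule" replaced "schedule_interval" in Airflow 2.4
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_legacy", python_callable=extract_legacy)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load  # run the load only after the extract succeeds
```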
Essential Skills
• Solid data engineering experience in both on-prem and cloud environments.
• Strong hands-on knowledge of Teradata and Informatica.
• Proven experience with SQL, Python, and large-scale data pipelines (a minimal reconciliation example follows this list).
• Familiarity with data warehousing, modelling, and performance tuning.
• Experience using AWS and Agile working methods.
• CI/CD and version control proficiency.
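As one concrete illustration of the SQL/Python pairing above, the sketch below runs a simple row-count reconciliation against Teradata, assuming the open-source teradatasql driver; the host, credentials, and table names are placeholders.

```python
# Illustrative row-count reconciliation between a staging and a target table.
# Assumes the open-source teradatasql driver; the connection details and
# table names below are hypothetical placeholders.
import teradatasql


def count_rows(conn, table: str) -> int:
    # Table names here are trusted constants, not user input.
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()


conn = teradatasql.connect(host="td.example.com", user="etl_svc", password="***")
try:
    staged = count_rows(conn, "sales_stg.orders")
    loaded = count_rows(conn, "sales_dw.orders")
    if staged != loaded:
        raise RuntimeError(f"Row counts diverge: staged={staged}, loaded={loaded}")
finally:
    conn.close()
```

In practice a check like this would run as a scheduled pipeline task and feed alerting rather than raise directly.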
Desirable
• Experience with Snowflake or other cloud-native warehouses (a minimal connectivity sketch follows this list).
• Exposure to GraphQL, DataOps, Terraform/CloudFormation.
• Interest in AI/ML integration in data engineering.
• Experience in enterprise-grade or regulated environments.
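For the Snowflake item, here is a minimal connectivity sketch using the snowflake-connector-python package; the account identifier, credentials, warehouse, and database names are hypothetical placeholders.

```python
# Minimal Snowflake connectivity check via snowflake-connector-python.
# Account, credentials, warehouse, and database are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="acme-eu-west-1",
    user="ETL_SVC",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")
    print(cur.fetchone()[0])  # prints the Snowflake server version
finally:
    conn.close()
```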