

Contract Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Contract Data Engineer, lasting 6 months (with the possibility of extension) at a day rate of £600-£650 outside IR35. Remote work is available, with occasional onsite presence in London. Requires 5+ years of Data Engineering experience and proficiency in Python, AWS or GCP, and Terraform.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
650
-
🗓️ - Date discovered
July 15, 2025
🕒 - Project duration
6 months (with possibility of extension)
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Outside IR35
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Monitoring #Security #Data Warehouse #Apache Airflow #Scala #ETL (Extract, Transform, Load) #Infrastructure as Code (IaC) #Lambda (AWS Lambda) #Airflow #SQL (Structured Query Language) #Data Modeling #S3 (Amazon Simple Storage Service) #Data Orchestration #Databases #Data Engineering #Docker #Consulting #Observability #GIT #Data Science #Version Control #BigQuery #Data Quality #Data Analysis #Redshift #Dataflow #Scripting #Data Processing #BI (Business Intelligence) #Data Governance #Python #AWS (Amazon Web Services) #Data Pipeline #Terraform #Kubernetes #Cloud #GCP (Google Cloud Platform) #Data Lake
Role description
Data Engineer - Contract
Location: Remote, with occasional onsite work in London
Contract Length: 6 months (with possibility of extension)
Start Date: Immediate / Flexible
Rate: £600-£650 outside IR35
Job Summary
We are seeking a skilled Data Engineer to join our team and support the design, development, and optimisation of scalable data pipelines and infrastructure. The ideal candidate has strong experience in cloud platforms (AWS or GCP), data orchestration tools, and infrastructure-as-code. This role is critical in ensuring high-quality data delivery and enabling advanced analytics and business intelligence initiatives.
Key Responsibilities
• Design, build, and maintain scalable and reliable data pipelines using Python.
• Develop and manage workflows and DAGs in Apache Airflow.
• Work with AWS or GCP to manage cloud-based data infrastructure.
• Use Terraform to provision, manage, and version cloud infrastructure.
• Collaborate with data analysts, data scientists, and business stakeholders to understand requirements and deliver solutions.
• Ensure data quality, monitoring, and observability across pipelines.
• Optimise the performance and cost-effectiveness of data solutions.
• Implement data governance and security best practices.
Required Skills & Experience
• 5+ years of experience in Data Engineering or related field.
• Proficient in Python for data processing and scripting.
• Hands-on experience with cloud platforms: AWS (e.g., S3, Glue, Lambda, Redshift) or GCP (e.g., BigQuery, Cloud Functions, Dataflow).
• Experience building and scheduling pipelines using Apache Airflow.
• Proficient with Terraform for managing infrastructure as code.
• Strong understanding of SQL and relational databases.
• Experience working with CI/CD workflows and version control systems (e.g., Git).
• Familiarity with data modeling, ETL best practices, and performance optimisation.
Preferred Qualifications
• Experience with containerization tools such as Docker.
• Familiarity with Kubernetes and cloud-native services.
• Knowledge of data lake and data warehouse architecture patterns.
• Prior experience in a contract or consulting role.