

Technopride
AWS Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer on a 12-month fixed-term contract, hybrid in London, with a pay rate of £70,000-£75,000 per year. Key skills include Python, Apache Spark, AWS services, and data engineering fundamentals.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
340
-
🗓️ - Date
January 15, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
London
-
🧠 - Skills detailed
#Apache Spark #Observability #Jenkins #AWS Glue #Agile #Apache Iceberg #AWS (Amazon Web Services) #Data Pipeline #Data Processing #Programming #GitHub #Version Control #Code Reviews #GitLab #Batch #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Python #Pytest #Scala #Data Engineering #S3 (Amazon Simple Storage Service) #Data Quality #Lambda (AWS Lambda) #Automated Testing
Role description
Hi, we are looking for an AWS Data Engineer for one of our renowned IT clients based in London, UK. If you are interested, please share your updated CV so we can discuss the role further.
Role: AWS Data Engineer
Location: London, UK
Is it Permanent / Contract: Fixed term contract for 12 months
Is it Onsite/Remote/Hybrid: Hybrid – 2 days a week in the office.
Start Date: Immediate
No. of Positions: 1
Job description
We are seeking an experienced AWS Data Engineer to design, build, and maintain scalable data platforms on AWS. This role will focus on developing reliable, testable, and observable data pipelines using modern data engineering and software development practices.
The successful candidate will work closely with business and technical stakeholders to translate requirements into robust, data-driven solutions, while contributing to the evolution of a modern lakehouse architecture.
Key Responsibilities
Design and develop scalable, testable data pipelines using Python and Apache Spark.
Orchestrate and manage data workflows using AWS services such as AWS Glue, EMR Serverless, Lambda, and Amazon S3.
Apply modern software engineering practices including version control, CI/CD pipelines, modular design, and automated testing.
Contribute to the development and evolution of a lakehouse architecture using Apache Iceberg or similar table formats.
Collaborate with business stakeholders to understand requirements and deliver effective data-driven solutions.
Build observability into data pipelines and implement basic data quality checks.
Participate in code reviews, pair programming, and architectural discussions to promote best practices.
Continuously build domain knowledge and share insights with the wider team.
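To illustrate the pipeline practices named above (testable transforms, observability, and basic data quality checks), here is a minimal plain-Python sketch. It is illustrative only; the function name `transform` and the `amount` field are hypothetical examples, not part of the role or any specific codebase:

```python
import logging
from typing import Any

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def transform(rows: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Keep rows with a positive 'amount'; log rejects for observability."""
    good: list[dict[str, Any]] = []
    bad: list[dict[str, Any]] = []
    for row in rows:
        (good if row.get("amount", 0) > 0 else bad).append(row)
    # Emitting pass/reject counts is a simple form of pipeline observability.
    log.info("quality check: %d passed, %d rejected", len(good), len(bad))
    return good

clean = transform([{"id": 1, "amount": 5.0}, {"id": 2, "amount": -1.0}])
```

In a real Spark pipeline the same pattern applies at scale: isolate each transformation as a pure function over a DataFrame so it can be unit-tested, and surface row counts and reject rates as metrics.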
Required Skills & Experience
Strong ability to write clean, maintainable Python code, ideally using type hints, linters, and testing frameworks such as pytest.
Solid understanding of data engineering fundamentals, including batch processing, schema evolution, and ETL pipeline design.
Hands-on experience with, or strong interest in learning, Apache Spark for large-scale data processing.
Practical experience with the AWS data ecosystem, including S3, Glue, Lambda, and EMR.
Experience working in Agile delivery environments and collaborating closely with cross-functional teams.
Strong communication skills and a willingness to understand business context and stakeholder needs.
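As a concrete example of the "clean, maintainable Python with type hints and pytest" expectation above, a candidate might be asked to write something like the following. This is an illustrative sketch; `PriceRecord` and `average_close` are hypothetical names, not part of the role:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PriceRecord:
    """An immutable, typed record — easier to test than a raw dict."""
    symbol: str
    close: float

def average_close(records: list[PriceRecord]) -> float:
    """Return the mean closing price; raise ValueError on empty input."""
    if not records:
        raise ValueError("no records supplied")
    return sum(r.close for r in records) / len(records)

# A pytest-style test: discovered automatically by pytest's `test_` convention.
def test_average_close() -> None:
    recs = [PriceRecord("ABC", 10.0), PriceRecord("ABC", 12.0)]
    assert average_close(recs) == 11.0
```

Type hints make the function's contract checkable with a linter such as mypy, and the frozen dataclass keeps test fixtures simple and deterministic.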
Nice to Have
Experience with Apache Iceberg or similar open table formats.
Familiarity with CI/CD tools such as GitLab CI, Jenkins, or GitHub Actions.
Exposure to data quality frameworks such as Great Expectations or Deequ.
Interest in financial markets, index data, or investment analytics.
Working Style
Collaborative and team-oriented, with a strong appreciation for shared ownership and knowledge exchange.
Curious and proactive, with a continuous learning mindset.
Comfortable balancing technical excellence with business outcomes.
Job Type: Fixed term contract
Contract length: 12 months
Pay: £70,000.00-£75,000.00 per year
Work Location: Hybrid remote in London






