Natobotics

AWS Cloud Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Cloud Developer with Data Engineering skills, offering a hybrid work location. Contract length and pay rate are unspecified. Key requirements include proficiency in AWS services, Python, and experience with ETL processes and data modeling.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 21, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Surrey, England, United Kingdom
-
🧠 - Skills detailed
#GitHub #Kubernetes #Microsoft Power BI #Security #S3 (Amazon Simple Storage Service) #Computer Science #ETL (Extract, Transform, Load) #Data Quality #Scala #REST API #RDS (Amazon Relational Database Service) #Terraform #Infrastructure as Code (IaC) #AWS (Amazon Web Services) #Data Pipeline #Agile #Data Engineering #Python #Batch #Automation #NoSQL #Documentation #Data Storage #EC2 #BI (Business Intelligence) #Data Modeling #Git #REST (Representational State Transfer) #Lambda (AWS Lambda) #Cloud #Databases #Storage #DevOps
Role description
Job Title: AWS Cloud Developer with Data Engineering Skills
Location: Guildford Business Park, Guildford, Surrey, GU2 8XG
Onsite/Remote/Hybrid: Hybrid

Job Description
As an AWS Cloud Developer with Data Engineering skills, you will be responsible for designing, developing, and maintaining scalable data solutions on the AWS platform. You will work closely with cross-functional teams to ensure seamless integration and optimal performance of our data engineering projects.

Key Responsibilities
• Design, develop, and implement data pipelines and ETL processes using AWS services.
• Collaborate directly with our FinOps teams and market units across the organization to understand data requirements and deliver solutions that meet business needs.
• Optimize and manage data storage solutions using AWS services such as S3, RDS, and NoSQL databases.
• Ensure data quality, integrity, and security across all data engineering projects.
• Monitor and troubleshoot data workflows to ensure high availability and performance.
• Design and build advanced, interactive dashboards using tools such as AWS QuickSight and Power BI.
• Create and oversee a cloud billing dashboard to track, manage, and optimize cloud costs and Reserved Instance purchases.
• Build a dashboard that gives all teams secure self-service visibility into cloud spend.
• Apply DevOps practices and CI/CD pipelines.
• Stay current with the latest AWS technologies and best practices to continuously improve our data infrastructure.
• Solve technical problems and create viable tooling.
• Design and implement shared services in cloud infrastructure.
• Use appropriate infrastructure-automation tools to provision cloud infrastructure components.
• Attend key Agile events and complete assigned work packages/tasks.
• Ensure smooth handover of project deliverables to internal and external customers.
• Actively contribute to internal projects such as tooling and documentation.
• Mentor new team members.
• Promote the use of automation to solve technical challenges.

Skills
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a Cloud Developer or Data Engineer, with a focus on AWS.
• Strong proficiency in AWS services such as EKS, EC2, S3, Lambda, and Glue.
• Solid understanding of data modeling, ETL processes, and data warehousing concepts.
• Strong proficiency in Python.
• Familiarity with infrastructure-as-code tools such as CloudFormation or Terraform.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
• Experience building and consuming REST APIs.
• Experience building and running Kubernetes background tasks with batch jobs.
• Hands-on experience with GitHub Actions.
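As a rough illustration of the ETL and data-quality responsibilities listed above, here is a minimal Python sketch of an extract-transform-load step. All names and data are hypothetical; in the actual role, "extract" would typically read from S3 or RDS and "load" would write to an AWS data store rather than an in-memory list.

```python
# Minimal, illustrative ETL step (hypothetical data; no AWS calls).

def extract(rows):
    """Extract: yield raw records (stand-in for an S3/RDS read)."""
    yield from rows

def transform(record):
    """Transform: normalise fields and drop incomplete records."""
    if not record.get("cost"):
        return None  # basic data-quality check: drop records with no cost
    return {
        "team": record["team"].strip().lower(),
        "cost_gbp": round(float(record["cost"]), 2),
    }

def load(records):
    """Load: collect cleaned records (stand-in for a warehouse write)."""
    return [r for r in (transform(rec) for rec in records) if r is not None]

raw = [
    {"team": " FinOps ", "cost": "123.456"},
    {"team": "Platform", "cost": None},  # dropped by the quality check
]
cleaned = load(extract(raw))
print(cleaned)  # [{'team': 'finops', 'cost_gbp': 123.46}]
```

In a real pipeline this shape maps naturally onto the AWS services the posting names: Glue or Lambda for the transform, S3 for raw and cleaned storage.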