

AWS Infrastructure & Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Infrastructure & Data Engineer; the contract length and pay rate are unspecified. It requires 5+ years of AWS and data engineering experience, proficiency across core AWS services, scripting in Python, and Infrastructure as Code with Terraform.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
September 24, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Infrastructure as Code (IaC) #Cloud #Scripting #Big Data #Terraform #Data Lake #Docker #Data Pipeline #Jenkins #Redshift #Kafka (Apache Kafka) #Python #Snowflake #RDS (Amazon Relational Database Service) #GIT #Data Engineering #AWS Glue #Automation #Data Quality #Data Science #S3 (Amazon Simple Storage Service) #Version Control #Spark (Apache Spark) #Hadoop #Security #EC2 #Data Warehouse #Scala #GitHub #Kubernetes #Lambda (AWS Lambda) #AWS (Amazon Web Services) #BI (Business Intelligence)
Role description
Senior AWS Infrastructure and Data Engineer
We’re seeking a highly skilled Senior AWS Infrastructure and Data Engineer to join our team. In this role, you’ll be responsible for designing, building, and maintaining cloud-based data infrastructure and pipelines on AWS. Your work will ensure our data systems are scalable, secure, and efficient—empowering our analytics and business intelligence teams to extract meaningful insights.
We are not able to provide sponsorship.
The ideal candidate brings deep expertise in AWS services across infrastructure and data, a strong foundation in data engineering principles, and a passion for automation and best practices.
Responsibilities
• Design and architect scalable, reliable data infrastructure using AWS services such as EC2, S3, RDS, Redshift, and Lambda.
• Build and maintain robust ETL/ELT pipelines using AWS Glue, AWS DataSync, and other relevant tools to ingest, transform, and load data from various sources (a Glue job sketch follows this list).
• Implement and manage data lakes and data warehouses, ensuring data quality, governance, and security.
• Develop and deploy Infrastructure as Code (IaC) using Terraform or CloudFormation to automate AWS resource provisioning.
• Monitor and optimize performance, cost, and security of the AWS data environment.
• Collaborate with data scientists, analysts, and engineers to understand data needs and deliver effective solutions.
• Troubleshoot and resolve issues related to data pipelines, infrastructure, and performance.
• Stay current with the latest AWS services and industry best practices.
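To make the pipeline responsibility above concrete, here is a minimal sketch of an AWS Glue ETL job in Python, the posting's named scripting language. The bucket paths (s3://example-raw-zone/..., s3://example-curated-zone/...) and the order_id field are illustrative assumptions, not details from the posting.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Boilerplate: resolve the job name Glue passes in and initialize the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest: read raw CSV files from S3 (bucket and path are hypothetical).
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-zone/orders/"]},
    format="csv",
    format_options={"withHeader": True},
)

# Transform: drop rows missing the (assumed) primary key.
cleaned = raw.filter(f=lambda row: row["order_id"] not in (None, ""))

# Load: write the curated data back to S3 as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-zone/orders/"},
    format="parquet",
)

job.commit()
```

In practice a job like this would be scheduled by a Glue trigger or a Step Functions state machine, with the source and target locations supplied as job parameters rather than hard-coded.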
Requirements
• 5+ years of experience in a similar role with a strong focus on AWS and data engineering.
• Proven expertise in core AWS services including S3, EC2, Lambda, RDS, Redshift, Glue, and Step Functions.
• Proficiency in at least one scripting language (e.g., Python).
• Extensive experience with Infrastructure as Code (IaC), especially Terraform (see the IaC sketch after this list).
• Solid understanding of database concepts, data warehousing, and ETL/ELT processes.
• Familiarity with version control systems (e.g., Git).
• Strong problem-solving skills and ability to work independently and collaboratively.
• Excellent communication skills for working with both technical and non-technical stakeholders.
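As an illustration of the IaC requirement above: the posting names Terraform or CloudFormation, but to keep every example in Python, here is a minimal AWS CDK sketch that synthesizes to CloudFormation. The stack name, construct ID, and bucket settings are hypothetical, not taken from the posting.

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct


class DataLakeStack(Stack):
    """Provisions one encrypted, versioned S3 bucket for a raw data-lake zone."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "RawZone",
            versioned=True,  # keep prior object versions for recoverability
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=RemovalPolicy.RETAIN,  # never delete data on stack teardown
        )


app = App()
DataLakeStack(app, "DataLakeStack")  # stack name is a placeholder
app.synth()
```

Running `cdk synth` produces the CloudFormation template and `cdk deploy` provisions it; the equivalent Terraform configuration would declare an aws_s3_bucket resource with matching versioning and encryption settings.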
Nice to Have
• AWS Professional certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect – Professional).
• Experience with big data technologies such as Spark, Hadoop, or Snowflake.
• Familiarity with CI/CD pipelines (e.g., Jenkins, GitHub Actions).
• Knowledge of containerization technologies (e.g., Docker, Kubernetes).
• Experience with stream processing tools (e.g., Kinesis, Kafka); a minimal producer sketch follows.
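For the stream-processing item above, here is a minimal Kinesis producer sketch using boto3. The stream name, region, and event shape are illustrative assumptions.

```python
import json

import boto3

# Hypothetical producer: pushes clickstream events onto a Kinesis data stream.
kinesis = boto3.client("kinesis", region_name="us-east-1")


def publish_event(event: dict) -> None:
    """Write one event; PartitionKey controls which shard receives it."""
    kinesis.put_record(
        StreamName="example-clickstream",  # placeholder stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("user_id", "anonymous")),
    )


publish_event({"user_id": 42, "page": "/pricing", "ts": "2025-09-24T12:00:00Z"})
```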