

Senior AWS Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Data Engineer on a 12-month, fully remote contract. It requires 8+ years of experience with AWS Glue, DynamoDB, Spark, Python, and Terraform. Key skills include ETL development, data security, and cloud computing.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 12, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Python #Terraform #Java #Programming #Scripting #Spark (Apache Spark) #SQL (Structured Query Language) #Data Extraction #Data Quality #Airflow #AWS (Amazon Web Services) #Data Engineering #Redshift #AWS Glue #Lambda (AWS Lambda) #Data Science #ETL (Extract, Transform, Load) #Data Pipeline #Database Design #Data Security #Security #DynamoDB #PySpark #Cloud #S3 (Amazon Simple Storage Service)
Role description
Fully remote. Immediate interviews for a Senior AWS Data Engineer on a 12-month contract.
Must have: DynamoDB, Spark, AWS Glue, Airflow, Lake Formation, Lambda, Python, PySpark, Terraform, CI/CD
Responsibilities:
Collaborate closely with cross-functional teams including Data Scientists, Analysts, and Software Engineers to understand data requirements and translate them into efficient solutions using AWS Glue and AWS services.
Develop and maintain ETL processes using AWS Glue for reliable data extraction, transformation, and loading (a minimal job sketch follows this list).
Implement robust data security measures and access controls in alignment with company policies and industry best practices (see the Lake Formation sketch below).
Monitor, troubleshoot, and enhance data pipelines, identifying and resolving performance bottlenecks, data quality issues, and other challenges.
Stay up-to-date with the latest advancements in AWS Glue and AWS services, and advocate for their effective utilization within the organization.
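
For illustration, a minimal AWS Glue ETL job in PySpark might look like the sketch below. The bucket names, columns, and partition key are hypothetical placeholders, not details of this role:

```python
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from awsglue.transforms import DropNullFields, SelectFields

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read raw JSON events from S3 (bucket and prefix are hypothetical)
source = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-bucket/events/"]},
    format="json",
)

# Transform: drop null fields and keep only the columns downstream consumers need
cleaned = DropNullFields.apply(frame=source)
projected = SelectFields.apply(frame=cleaned, paths=["event_id", "user_id", "created_at"])

# Load: write Parquet to a curated prefix, partitioned by the (hypothetical) date column
glue_context.write_dynamic_frame.from_options(
    frame=projected,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/events/",
        "partitionKeys": ["created_at"],
    },
    format="parquet",
)

job.commit()
```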
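On the access-control side, Lake Formation (a must-have above) is the natural fit. A minimal sketch of a column-level grant via boto3, where the account ID, IAM role, database, table, and column names are all hypothetical:

```python
import boto3

lakeformation = boto3.client("lakeformation")

# Grant an analyst role read access to non-sensitive columns only
lakeformation.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/analyst-role"},
    Resource={
        "TableWithColumns": {
            "DatabaseName": "curated",
            "Name": "events",
            "ColumnNames": ["event_id", "created_at"],  # sensitive columns excluded
        }
    },
    Permissions=["SELECT"],
)
```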
Experience/Minimum Requirements
8+ years of proven experience as a Data Engineer, with a strong emphasis on AWS Glue and other AWS services.
In-depth understanding of AWS Glue architecture, performance optimization techniques, and best practices.
Proficiency in SQL and experience with database design principles.
Hands-on expertise in designing, building, and maintaining complex ETL pipelines using AWS Glue.
Familiarity with data warehousing concepts and methodologies.
Competence in cloud computing and AWS services, with a focus on data-related services such as S3, Redshift, and Lambda (see the Lambda sketch after this list).
Proficiency in scripting and programming languages such as Python, Java, or similar.
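
As a rough illustration of how S3, Lambda, and Glue typically connect in a pipeline like this, the handler below starts a Glue job run for each new object in a raw bucket. It assumes an S3 ObjectCreated notification trigger; the job name and argument key are hypothetical:

```python
import boto3
from urllib.parse import unquote_plus

glue = boto3.client("glue")

def handler(event, context):
    """Start a Glue job for each object dropped in the raw bucket."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys are URL-encoded, so decode before building the path
        key = unquote_plus(record["s3"]["object"]["key"])
        glue.start_job_run(
            JobName="curated-events-etl",  # hypothetical job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
    return {"statusCode": 200}
```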