AWS Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Engineer on a 6-month contract, offering a pay rate of "$XX/hour." Located in Reston, VA, it requires expertise in AWS services, big data tools, Python scripting, and infrastructure as code.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
576
-
πŸ—“οΈ - Date discovered
June 7, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Reston, VA
-
🧠 - Skills detailed
#Shell Scripting #EC2 #DevOps #Agile #AWS (Amazon Web Services) #Hadoop #Automation #Amazon EMR (Amazon Elastic MapReduce) #Terraform #Security #Data Processing #Lambda (AWS Lambda) #Documentation #Ansible #Big Data #Scripting #Scala #Data Engineering #Ab Initio #AWS Glue #Spark (Apache Spark) #Python #Informatica #Deployment #Compliance #Unix #PySpark #GitLab #ETL (Extract, Transform, Load) #Data Pipeline #Infrastructure as Code (IaC) #Cloud
Role description
JustinBradley's client, a leading source of mortgage financing, is seeking a highly skilled AWS Data Engineer. The ideal candidate will have deep hands-on experience building, automating, and supporting large-scale data and infrastructure solutions using AWS services and big data technologies. You will play a key role in designing and managing cloud-native data workflows, implementing infrastructure as code, and supporting mission-critical production environments in a rotating 24/7 capacity. Candidates must be local to Reston, VA, for the hybrid work environment.

Responsibilities:
• Design, build, and manage scalable data pipelines using Spark, PySpark, AWS Glue, and Amazon EMR/Hadoop.
• Develop and maintain infrastructure as code using Terraform, CloudFormation, and GitLab CI/CD pipelines.
• Automate provisioning, configuration, and deployment of AWS services including EC2, SSM, CloudWatch, and Lambda.
• Write efficient Python and shell scripts to support automation and data processing needs.
• Provide production support, troubleshoot system issues, and ensure high availability and performance.
• Collaborate with cross-functional teams to gather requirements and deliver technical solutions that meet business needs.
• Create and maintain detailed documentation and conduct technical presentations as needed.
• Follow security, compliance, and operational best practices within cloud environments.

Requirements:
• Hands-on experience with core AWS services (EC2, SSM, CloudFormation, CloudWatch, Lambda).
• Strong experience with big data tools such as Spark, Amazon EMR, Hadoop, and AWS Glue.
• Proficiency in Python and Unix shell scripting.
• Hands-on experience building reusable Terraform modules and managing infrastructure using infrastructure as code.
• Experience automating operational workflows using tools like GitLab, Ansible, and Python.
• Proven ability to support and troubleshoot production environments in a 24/7 rotation.
• Excellent communication, time management, and technical presentation skills.

Nice to Have:
• Experience with ETL tools such as Informatica or Ab Initio.
• AWS Certification (e.g., AWS Certified Data Analytics, Solutions Architect).
• Experience in Agile or DevOps environments.

JustinBradley is an EO employer - Veterans/Disabled and other protected categories.