

AWS Cloud Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior AWS Engineer on a 12+ month contract, hybrid (Reston, VA or Plano, TX), paying $600/day. It requires 3+ years in data engineering plus AWS, Python, Unix/Linux, SQL, and Agile experience. AWS certification is preferred; cybersecurity experience is essential.
Country
United States
Currency
$ USD
Day rate
600
Date discovered
August 6, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Reston, VA
Skills detailed
#SQL Queries #AWS (Amazon Web Services) #ML (Machine Learning) #Terraform #Scala #Scrum #SQL (Structured Query Language) #Agile #Databases #Shell Scripting #Scripting #Security #Monitoring #Cloud #ETL (Extract, Transform, Load) #Data Ingestion #Computer Science #S3 (Amazon Simple Storage Service) #Data Pipeline #Redshift #Data Engineering #Cybersecurity #GitHub #Python #Logging #Version Control #Data Science #Lambda (AWS Lambda) #Compliance #DevOps #Data Governance #Automation #Unix #Linux #Documentation
Role description
Role Title: Senior AWS Engineer
Location: Reston, VA or Plano, TX (Hybrid, 3x)
Duration: 12+ Months Contract
Interview: WebEx
ADDITIONAL DETAILS:
• Preferably a contract-to-hire (CTH) opportunity
• 3–6 years of Python
• AWS experience
• Unix/Linux experience
• Agile methodology
• Nice to have: DevOps experience, GitHub, Terraform, AWS certification
• This team works in the cybersecurity space, supporting critical infosec workflows from the cyber data lake
Key Responsibilities:
• Design, build, and maintain scalable, secure, and efficient data pipelines using AWS services such as Glue, Lambda, Step Functions, S3, Redshift, EMR, and Data Pipeline.
• Develop robust Python scripts for data ingestion, transformation, and automation.
• Write and optimize complex SQL queries for ETL and analytics workflows.
• Operate in Unix/Linux environments for scripting, automation, and system-level data operations.
• Participate in Agile ceremonies (daily stand-ups, sprint planning, retrospectives) and contribute to iterative delivery of data solutions.
• Collaborate with cross-functional teams to gather requirements and translate them into high-level architecture and design documents.
• Communicate technical concepts clearly through documentation, presentations, and stakeholder meetings.
• Implement monitoring, logging, and alerting for data pipelines to ensure reliability and performance.
• Apply DevOps best practices using GitHub, Terraform, and CloudFormation for infrastructure automation and CI/CD.
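To give candidates a concrete sense of the ingestion and transformation scripting described above, here is a minimal sketch in plain Python. The event shape, field names, and handler signature are illustrative assumptions, not taken from the posting; in a real pipeline the records would arrive from S3 (via boto3) triggered by an S3 event or Step Functions.

```python
import json

def transform_records(raw_lines):
    """Parse newline-delimited JSON records, skip malformed ones,
    and normalize a hypothetical 'severity' field to upper case."""
    out = []
    for line in raw_lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # drop bad records rather than failing the whole batch
        rec["severity"] = str(rec.get("severity", "unknown")).upper()
        out.append(rec)
    return out

def handler(event, context=None):
    """Lambda-style entry point: 'event' is assumed to carry raw lines.
    All AWS I/O is stubbed out so the transform logic stands alone."""
    records = transform_records(event.get("lines", []))
    return {"count": len(records), "records": records}
```

The key design point this sketch illustrates is isolating pure transformation logic from AWS I/O, which keeps the code unit-testable outside the cloud environment.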
Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 3+ years of experience in data engineering or a similar role.
• Strong hands-on experience with AWS data services (e.g., EMR, Glue, Lambda, Step Functions, S3, Redshift).
• Advanced proficiency in Python for scripting and automation.
• Solid experience with Unix/Linux shell scripting.
• Strong command of SQL and experience with relational databases.
• Proficiency with GitHub for version control and collaboration.
• Experience with Terraform and/or AWS CloudFormation for infrastructure-as-code.
• Experience working in Agile/Scrum environments.
• Excellent verbal and written communication skills.
• Proven ability to contribute to high-level solution design and architecture discussions.
• AWS Certification (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect, or equivalent).
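As a taste of the "complex SQL queries for ETL and analytics workflows" the role calls for, the sketch below runs an aggregation with a window function. The schema and table names are hypothetical; in practice a query like this would target Redshift, but Python's built-in sqlite3 is enough to demonstrate the SQL itself.

```python
import sqlite3

# Hypothetical schema standing in for a cyber data lake table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (id INTEGER, source TEXT, bytes INTEGER);
    INSERT INTO events VALUES
        (1, 'fw', 100), (2, 'fw', 300), (3, 'ids', 50);
""")

# Aggregate bytes per source and rank sources by total volume --
# the flavor of analytics query an ETL workflow might emit.
rows = conn.execute("""
    SELECT source,
           SUM(bytes) AS total_bytes,
           RANK() OVER (ORDER BY SUM(bytes) DESC) AS volume_rank
    FROM events
    GROUP BY source
    ORDER BY total_bytes DESC
""").fetchall()
```

Note that window functions such as `RANK() OVER` require SQLite 3.25 or later; Redshift supports the same ANSI window-function syntax.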
Preferred Qualifications:
• Exposure to machine learning pipelines or data science workflows.
• Experience with data governance, security, and compliance best practices.