

AWS Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Data Engineer, hybrid in Reston, VA or Plano, TX, lasting 6+ months. Pay rate is competitive. Requires 3+ years in data engineering, AWS services proficiency, advanced Python, SQL expertise, and AWS certification.
Country: United States
Currency: $ USD
Day rate: 600
Date discovered: August 15, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Reston, VA
Skills detailed: #Agile #DevOps #Automation #Version Control #Databases #Scrum #ML (Machine Learning) #Linux #Documentation #GitHub #Strategy #Shell Scripting #Monitoring #Data Pipeline #Lambda (AWS Lambda) #AWS (Amazon Web Services) #Computer Science #Logging #Python #Cloud #Compliance #Data Governance #Data Science #Scripting #Scala #Security #Data Loss Prevention #Terraform #Data Engineering #S3 (Amazon Simple Storage Service) #Unix #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Ingestion #SQL Queries #Redshift
Role description
Position: Senior AWS Data Engineer
Work Location: Reston, VA or Plano, TX (second preference) (Hybrid, 3 days/week)
Duration: 6+ months (possible extensions)
Job Description:
Key Responsibilities:
• Design, build, and maintain scalable, secure, and efficient data pipelines using AWS services such as Glue, Lambda, Step Functions, S3, Redshift, EMR, and Data Pipeline.
• Develop robust Python scripts for data ingestion, transformation, and automation.
• Write and optimize complex SQL queries for ETL and analytics workflows.
• Operate in Unix/Linux environments for scripting, automation, and system-level data operations.
• Participate in Agile ceremonies (daily stand-ups, sprint planning, and retrospectives) and contribute to iterative delivery of data solutions.
• Collaborate with cross-functional teams to gather requirements and translate them into high-level architecture and design documents.
• Communicate technical concepts clearly through documentation, presentations, and stakeholder meetings.
• Implement monitoring, logging, and alerting for data pipelines to ensure reliability and performance.
• Apply DevOps best practices using GitHub, Terraform, and CloudFormation for infrastructure automation and CI/CD.
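To give a concrete feel for the pipeline work described above, here is a minimal, Lambda-style Python sketch of an ingestion/transformation step. The event shape and field names (`id`, `amount`, `ts`) are illustrative assumptions, not part of the role description; a real pipeline would match the actual source schema and load the results into Redshift.

```python
import json
from datetime import datetime, timezone


def transform_records(raw_records):
    """Coerce types and stamp a load time on raw ingestion records.

    Field names ('id', 'amount', 'ts') are hypothetical placeholders
    for whatever schema the upstream source actually emits.
    """
    cleaned = []
    for rec in raw_records:
        cleaned.append({
            "id": str(rec["id"]),                          # normalize key to string
            "amount": round(float(rec.get("amount", 0)), 2),  # default + coerce numeric
            "event_ts": rec["ts"],                         # pass through source timestamp
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return cleaned


def handler(event, context):
    # In a deployed Lambda this step would typically read the raw object
    # from S3 and load the cleaned rows into Redshift (e.g. via COPY or
    # the Data API); here it only transforms the payload so the core
    # logic stays self-contained and unit-testable.
    records = json.loads(event["body"])
    return {"statusCode": 200,
            "body": json.dumps(transform_records(records))}
```

Keeping the transformation pure (no S3/Redshift calls inside `transform_records`) is a common design choice: it lets the business logic be tested locally while the handler wires in the AWS I/O.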
Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 3+ years of experience in data engineering or a similar role.
• Strong hands-on experience with AWS data services (e.g., EMR, Glue, Lambda, Step Functions, S3, Redshift).
• Advanced proficiency in Python for scripting and automation.
• Solid experience with Unix/Linux shell scripting.
• Strong command of SQL and experience with relational databases.
• Proficiency with GitHub for version control and collaboration.
• Experience with Terraform and/or AWS CloudFormation for infrastructure-as-code.
• Experience working in Agile/Scrum environments.
• Excellent verbal and written communication skills.
• Proven ability to contribute to high-level solution design and architecture discussions.
• AWS certification (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect, or equivalent).
Preferred Qualifications:
• Exposure to machine learning pipelines or data science workflows.
• Experience with data governance, security, and compliance best practices.
• Experience driving strategy and execution of a Data Loss Prevention (DLP) program and/or other relevant InfoSec programs in collaboration with a wide range of stakeholders.
• Ability to provide DLP input on the design and configuration of security controls across multiple capabilities, including firewall, proxy, endpoint, and messaging, and to assess and influence risk-based prioritization of DLP and other security controls.
• Ability to advise on and assist with security, data, and technology initiatives that impact the entire organization.
"Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of Minority/Gender/Disability/Religion/LGBTQI/Age/Veteran status."