

Visionary Innovative Technology Solutions LLC
F2F Interview: Sr. AWS Python Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. AWS Python Developer in Virginia, offering a contract of over 6 months. Key skills include strong Python and PySpark experience, AWS services (Lambda, Glue, Redshift), SQL proficiency, and familiarity with CI/CD practices.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 5, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
Virginia, United States
-
🧠 - Skills detailed
#Amazon Redshift #Automation #Scala #SQL (Structured Query Language) #DevOps #ETL (Extract, Transform, Load) #PySpark #Python #Storage #Data Transformations #Redshift #Version Control #Data Quality #Git #SNS (Simple Notification Service) #Spark (Apache Spark) #Lambda (AWS Lambda) #Security #Deployment #Documentation #Infrastructure as Code (IaC) #Monitoring #AWS (Amazon Web Services) #AWS Glue #SQS (Simple Queue Service) #Data Storage #Cloud #Data Modeling #Data Processing
Role description
Position: Sr. AWS Python Developer
Location: Virginia (Onsite)
Duration: Contract / Full time
Job Description:
Seeking an experienced AWS Python Developer to design, build, and maintain cloud-native applications and automation on AWS. The ideal candidate will have strong Python skills, hands-on experience with core AWS services, and experience implementing Infrastructure as Code and CI/CD pipelines. This role collaborates with product, DevOps, and security teams to deliver reliable, scalable, and cost-efficient solutions.
Key Responsibilities
• Build and maintain ETL pipelines using Python and PySpark on AWS Glue and other compute platforms
• Orchestrate workflows with AWS Step Functions and serverless components (Lambda)
• Implement messaging and event-driven patterns using AWS SNS and SQS
• Design and optimize data storage and querying in Amazon Redshift
• Write performant SQL for data transformations, validation, and reporting
• Ensure data quality, monitoring, error handling and operational support for pipelines
• Collaborate with data consumers, engineers, and stakeholders to translate requirements into solutions
• Contribute to CI/CD, infrastructure-as-code, and documentation for reproducible deployments
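As a sketch of the event-driven pattern named above (SQS messages triggering a Lambda function), the handler below uses Lambda's partial-batch-failure response so only failed messages are retried; the `order_id` field is a hypothetical example, not a requirement of this role:

```python
import json


def handler(event, context):
    """Process a batch of SQS records delivered to a Lambda function.

    Returns the batchItemFailures structure AWS Lambda uses for
    partial batch responses, so successfully processed messages
    are not redelivered when one record fails.
    """
    failures = []
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            # Placeholder transform: require a hypothetical field.
            if "order_id" not in body:
                raise ValueError("missing order_id")
        except (json.JSONDecodeError, ValueError):
            # Report only this message as failed; SQS will retry it.
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```

In production the handler would be wired to an SQS event source mapping with `ReportBatchItemFailures` enabled; the sketch only shows the per-record error-handling shape.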
Required skills and experience
• Strong experience with Python and PySpark for large-scale data processing
• Proven hands-on experience with AWS services: Lambda, SNS, SQS, Glue, Redshift, Step Functions
• Solid SQL skills and familiarity with data modeling and query optimization
• Experience with ETL best practices, data quality checks, and monitoring/alerting
• Familiarity with version control (Git) and basic DevOps/CI-CD workflows
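The data-quality checks mentioned in the requirements could look like the following minimal, dependency-free sketch (the rule set and field names are illustrative assumptions, not part of this posting):

```python
def validate_rows(rows, required, numeric=()):
    """Split rows into (valid, rejected) using simple quality rules.

    required: field names that must be present and non-null
    numeric:  field names whose values must parse as a float
    """
    valid, rejected = [], []
    for row in rows:
        ok = all(row.get(field) is not None for field in required)
        for field in numeric:
            try:
                float(row[field])
            except (KeyError, TypeError, ValueError):
                ok = False
        (valid if ok else rejected).append(row)
    return valid, rejected
```

In a Glue/PySpark job the same idea would typically be expressed as DataFrame filters, with rejected rows routed to a quarantine location for monitoring.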
