

SRS Consulting Inc
Python AWS Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Python AWS Developer in Dallas, TX, requiring 12+ years of overall experience. Contract type is W2. Key skills include 5–7+ years with Python, AWS (Lambda, S3), cloud architecture, CI/CD, and experience with data pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 12, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Databricks #DataLake #Lambda (AWS Lambda) #Python #GIT #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #DataPipeline #Cloud #VersionControl
Role description
Python AWS Developer
Location: Dallas, TX (Local to Texas)
Candidates must have 12+ years of experience - W2 only
Requirements:
• 5–7+ years of hands-on experience with Python as your primary language
• Moderate to strong experience with AWS services, especially:
  ◦ Lambda
  ◦ Step Functions
  ◦ S3
  ◦ Glue (nice to have)
• Experience building and supporting cloud-based, event-driven workflows
• Comfort working with large, sometimes legacy, codebases and optimizing performance
• Strong understanding of cloud-native design and modern application practices
• Knowledge of cloud architecture best practices
• Familiarity with CI/CD and version control (e.g., Git, CodePipeline)
• Ability to work independently and communicate effectively in a remote team
• Strong collaboration skills, good team fit, and excellent communication
• Experience with data pipelines or file ingestion systems
• Exposure to Data Lake architectures or tools like Databricks (not required, but helpful)
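To illustrate the event-driven Lambda/S3 work the requirements describe, here is a minimal sketch of a Python Lambda handler that parses an S3 put-event payload. The bucket and key names are hypothetical; a real pipeline would use boto3 to fetch each object and feed it downstream.

```python
import json


def lambda_handler(event, context):
    """Minimal AWS Lambda handler for an S3 put event.

    Extracts the bucket and object key from each record in the
    event. A production handler would fetch the object with boto3
    and pass it into the rest of the ingestion pipeline.
    """
    results = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        results.append({
            "bucket": s3["bucket"]["name"],
            "key": s3["object"]["key"],
        })
    return {"statusCode": 200, "body": json.dumps(results)}


# Hypothetical S3 event, abridged from the AWS-documented shape.
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "example-bucket"},
                "object": {"key": "incoming/data.csv"}}}
    ]
}
```

Calling `lambda_handler(sample_event, None)` returns a 200 response whose body lists each `(bucket, key)` pair found in the event.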