

Need Local: Python Developer with Airflow Exp
Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Developer with Airflow experience, located on-site in Reston, VA. The contract runs for more than 6 months at a pay rate of "$XX/hour". Key skills include Python, AWS, Airflow, and Autosys; 9+ years of experience is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 17, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Reston, VA
Skills detailed: #Monitoring #Python #Data Pipeline #PostgreSQL #Java #Deployment #Code Reviews #Computer Science #Compliance #Kubernetes #Batch #GitLab #Version Control #NoSQL #Oracle #SQS (Simple Queue Service) #GIT #IAM (Identity and Access Management) #SNS (Simple Notification Service) #Leadership #Databases #Cloud #AWS (Amazon Web Services) #Scala #Jenkins #Logging #Docker #MongoDB #Security #Programming #Migration #Strategy #Automation #JEE (Java Platform Enterprise Edition) #RDBMS (Relational Database Management System) #DynamoDB #S3 (Amazon Simple Storage Service) #Apache Airflow #Airflow
Role description
Position: Python Developer with Airflow Experience
Location: Day One On-site (Reston, VA)
Duration: Contract / Full-Time
Mandatory skills: Python, AWS, and Airflow
Experience Level:
- Overall Professional Experience: 9+ years
- Apache Airflow Specific Experience: 3+ years
- Autosys Specific Experience: 3+ years
Job Summary:
Based in Virginia, this individual will play a key role in designing, implementing, and migrating batch processes from Autosys to Airflow.
Responsibilities:
1. MWAA Migration & Strategy Leadership:
   - Lead the end-to-end migration of existing Autosys workflows and jobs to AWS MWAA, including assessment, planning, re-platforming, testing, and validation support.
   - Develop comprehensive migration strategies, roadmaps, and execution plans, minimizing disruption to ongoing operations.
   - Design and implement robust, scalable, and secure batch pipelines within MWAA, translating Autosys concepts and logic into efficient Airflow DAGs.
   - Serve as the primary technical expert for the Autosys-to-MWAA migration, providing guidance and troubleshooting support.
2. Airflow & MWAA Expertise:
   - Architect, develop, and migrate highly scalable, reliable, and efficient batch pipelines using Python and Airflow DAGs within MWAA.
   - Manage and optimize MWAA environments, including infrastructure setup, configuration, monitoring, and scaling.
   - Implement best practices for Airflow DAG development, testing, deployment, and version control (e.g., Git, CI/CD pipelines).
   - Troubleshoot and resolve complex issues related to Airflow DAGs, infrastructure, and performance within MWAA.
3. Autosys Legacy System Understanding:
   - Possess a strong understanding of Autosys concepts, job types (e.g., Command, File Watcher, Box jobs), dependencies, calendars, and monitoring.
   - Perform in-depth analysis of existing Autosys workflows to identify migration complexities and opportunities for optimization.
   - Collaborate with existing Autosys administrators and application teams to ensure accurate and complete migration of functionality.
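The translation work described above, turning Autosys job definitions into Airflow concepts, can be sketched as a small mapping. This is a hypothetical illustration only, not a prescribed migration tool: the JIL attribute subset, the `jil_to_airflow` function name, and the job names are all assumptions for the sketch.

```python
# Hypothetical sketch: map a few common attributes of a parsed Autosys JIL
# job definition onto their Airflow equivalents. Attribute and job names
# are illustrative, not taken from a real migration.

def jil_to_airflow(job: dict) -> dict:
    """Translate selected Autosys JIL attributes to Airflow concepts."""
    mapped = {}
    if job.get("job_type") == "BOX":
        # An Autosys box job groups related jobs; in Airflow the DAG itself
        # plays that grouping role.
        mapped["kind"] = "dag"
        mapped["dag_id"] = job["insert_job"]
    else:
        # A command job becomes a single task (e.g. a BashOperator).
        mapped["kind"] = "task"
        mapped["task_id"] = job["insert_job"]
        mapped["bash_command"] = job.get("command", "")
    # An Autosys condition like "success(extract)" becomes an upstream
    # dependency edge (extract >> this_task in DAG code).
    condition = job.get("condition", "")
    mapped["upstream"] = [
        part[len("success("):-1]
        for part in condition.split(" & ")
        if part.startswith("success(") and part.endswith(")")
    ]
    return mapped


# Example: a command job that runs after "extract" succeeds.
load_job = {
    "insert_job": "load",
    "job_type": "CMD",
    "command": "run_load.sh",
    "condition": "success(extract)",
}
print(jil_to_airflow(load_job)["upstream"])  # -> ['extract']
```

In practice the mapping is rarely this mechanical: calendars, file watchers, and cross-box conditions each need case-by-case treatment, which is why the role calls for deep experience on both sides.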
4. Team Leadership & Mentorship:
   - Provide technical leadership, guidance, and mentorship to a team of migration engineers, fostering a collaborative, high-performing environment focused on migration and cloud adoption.
   - Conduct code reviews, provide constructive feedback, and ensure adherence to coding standards and best practices.
   - Contribute to the professional development of team members through training, knowledge sharing, and coaching on Airflow, MWAA, and migration best practices.
5. Architecture & Operational Excellence:
   - Collaborate with SMEs, product owners, and other stakeholders to define data pipeline requirements and translate them into robust Airflow solutions.
   - Contribute to the overall batch process architecture strategy, identifying opportunities for optimization, automation, and innovation post-migration.
   - Implement robust monitoring, alerting, and logging solutions for Airflow DAGs and MWAA environments.
   - Ensure compliance with security best practices for process handling and access within Airflow and MWAA.
6. Expertise in AWS:
   - Hands-on knowledge of and expertise in S3, SQS, SNS, EKS, ECS Fargate, and Step Functions.
   - Evaluate and implement architecture options for end-to-end batch processes on AWS.
Qualifications:
- Bachelor's or Master's degree in Computer Science or a related quantitative field.
- 10+ years of overall professional experience in Java/JEE, RDBMS databases, and software development.
- 5+ years of hands-on experience with Spring Boot and Spring Batch.
- 5+ years of hands-on experience specifically with Apache Airflow, including complex DAG development, custom operators/hooks, and plugin creation.
- 3+ years of hands-on experience with Autosys, demonstrating a solid understanding of its features, job scheduling, and administration.
- Proven track record of successfully leading and executing migrations from legacy scheduling tools (specifically Autosys) to cloud-native orchestration platforms like AWS MWAA.
- Demonstrable expert-level experience with AWS Managed Workflows for Apache Airflow (MWAA), including environment setup, configuration, scaling, security, and troubleshooting.
- 5+ years of hands-on experience in Python programming for Airflow DAG development.
- Extensive experience with AWS cloud services, including but not limited to CloudWatch and IAM.
- Proven experience designing and implementing highly scalable and fault-tolerant batch pipelines.
- Experience with relational and NoSQL databases (e.g., PostgreSQL, Oracle, DynamoDB, MongoDB).
- Familiarity with CI/CD practices and tools (e.g., Jenkins, GitLab CI, AWS CodePipeline).
- Strong problem-solving skills and the ability to diagnose and resolve complex technical issues, especially during migration.
- Excellent communication, interpersonal, and leadership skills, with the ability to collaborate effectively with cross-functional teams and mentor junior engineers.
Preferred Qualifications:
- AWS Certifications (e.g., AWS Certified Solutions Architect - Associate/Professional).
- Experience with containerization technologies (Docker, Kubernetes).
- Experience in the financial services sector, particularly in environments with complex legacy systems.