

Python Developer with Airflow
Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Developer with Airflow, working hybrid in Reston, VA and Jersey City, NJ, on a full-time contract lasting more than 6 months. It requires 10+ years of Python, 3+ years of Airflow, Oracle 19c expertise, and experience with job schedulers.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 25, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Jersey City, NJ
Skills detailed: #Automation #Data Quality #Oracle #Triggers #Databases #Documentation #Data Lineage #REST (Representational State Transfer) #Jenkins #Monitoring #Code Reviews #API (Application Programming Interface) #Perl #Process Automation #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Airflow #Data Governance #Apache Airflow #Bitbucket #Debugging #REST API #Git #Python
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Visionary Innovative Technology Solutions, is seeking the following. Apply via Dice today!
Role: Python Developer with Airflow
Location: Reston, VA & Jersey City, NJ (Hybrid)
Duration: Full Time/Contract
We're seeking an experienced Python Developer to lead the automation and orchestration of complex data workflows. The ideal candidate will have hands-on experience designing robust, fault-tolerant, and auditable pipelines across on-prem Oracle systems, integrating with job schedulers like RunMyJobs, and modernizing legacy processes using Apache Airflow.
You will play a critical role in replacing legacy Perl/PL/SQL scheduling logic with modern, Python-based DAG orchestration while ensuring traceability, data quality, and recoverability.
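By way of illustration only, here is a minimal sketch of the kind of retry- and alert-aware DAG this work involves, assuming Airflow 2.x with the apache-airflow-providers-oracle package installed; the DAG id, connection id, schedule, email address, and PL/SQL call are hypothetical placeholders, not details from this posting:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.oracle.operators.oracle import OracleOperator

# Retry and alerting defaults applied to every task in the DAG.
default_args = {
    "retries": 3,                          # re-run transient failures
    "retry_delay": timedelta(minutes=5),   # back off between attempts
    "email_on_failure": True,              # alert when retries are exhausted
    "email": ["data-ops@example.com"],     # hypothetical alert address
}

def log_completion(**context):
    # Placeholder for audit-trail / SLA bookkeeping after the load finishes.
    print(f"Nightly load for {context['ds']} completed")

with DAG(
    dag_id="oracle_nightly_load",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="0 2 * * *",         # nightly at 02:00
    catchup=False,
    default_args=default_args,
) as dag:
    run_load = OracleOperator(
        task_id="run_pl_sql_load",
        oracle_conn_id="oracle_19c",       # hypothetical Airflow connection id
        sql="BEGIN etl_pkg.nightly_load; END;",  # hypothetical PL/SQL entry point
    )
    notify = PythonOperator(task_id="log_completion", python_callable=log_completion)

    run_load >> notify                     # run the PL/SQL load, then log it

The default_args block is where requirements like "robust error handling, alerting and retry mechanisms" typically live in Airflow; individual tasks can still override them.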
Key Responsibilities:
• Develop, deploy, and maintain Python-based automation scripts to orchestrate jobs across Oracle 19c on-prem systems.
• Design and implement Airflow DAGs to manage complex, interdependent ETL workflows.
• Migrate existing job logic from Perl, RunMyJobs, and PL/SQL-based scheduling into modular, observable Airflow DAGs.
• Build custom Airflow operators/sensors for integration with Oracle, REST APIs, file drops (SFTP/FTP), and external triggers (see the sensor sketch after this section).
• Implement robust error handling, alerting, and retry mechanisms across job pipelines.
• Collaborate with DBAs and application teams to understand job dependencies, critical paths, and data lineage.
• Establish job execution logs, audit trails, and SLA monitoring dashboards.
• Participate in code reviews and documentation, and onboard new jobs into the orchestrator.
Required Skills and Experience:
• 10+ years of Python development experience, with a strong understanding of system/process automation.
• 3+ years of Apache Airflow experience building production DAGs.
• Solid understanding of Oracle 19c databases, SQL tuning, and PL/SQL concepts.
• Experience orchestrating jobs that move large volumes of data across enterprise systems.
• Familiarity with job schedulers (RunMyJobs, AutoSys, etc.) and how to replace or abstract them using orchestration tools.
• Strong debugging skills across logs, databases, and filesystems for failed jobs or partial runs.
• Experience building REST API integrations, SFTP/file-movement logic, and parameter-driven automation.
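As referenced in the responsibilities above, a custom sensor for external file drops might look like the hedged sketch below, assuming the apache-airflow-providers-sftp package; the connection id and remote path are illustrative, not from the posting:

from airflow.providers.sftp.hooks.sftp import SFTPHook
from airflow.sensors.base import BaseSensorOperator

class FileDropSensor(BaseSensorOperator):
    """Waits for an upstream system to drop a file on an SFTP server."""

    def __init__(self, sftp_conn_id: str, remote_path: str, **kwargs):
        super().__init__(**kwargs)
        self.sftp_conn_id = sftp_conn_id
        self.remote_path = remote_path

    def poke(self, context) -> bool:
        # Airflow calls poke() every poke_interval until it returns True
        # (or the sensor times out and the task is marked failed).
        hook = SFTPHook(self.sftp_conn_id)
        exists = hook.path_exists(self.remote_path)
        self.log.info("Polled %s -> %s", self.remote_path, exists)
        return exists

# Hypothetical usage inside a DAG definition:
# wait_for_positions = FileDropSensor(
#     task_id="wait_for_positions_file",
#     sftp_conn_id="vendor_sftp",
#     remote_path="/inbound/positions.csv",
#     poke_interval=300,        # check every 5 minutes
#     timeout=60 * 60 * 6,      # give up after 6 hours
# )

Subclassing BaseSensorOperator keeps the polling, timeout, and rescheduling behavior in Airflow's hands; only the file-existence check is custom.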
Bonus / Preferred Experience:
• Prior experience modernizing legacy data workflows from Perl or PL/SQL stored procedures.
• Hands-on knowledge of Git/Bitbucket, Jenkins, and CI/CD pipelines for code-controlled job rollouts.
• Familiarity with financial data models (e.g., holdings, transactions, NAVs, tax lots).
• Basic understanding of data governance, audit, and operational risk in financial systems.