

ETL Developer (Python & Airflow) - W2 Contract
Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer (Python & Airflow) on a W2 contract, requiring strong Python and Apache Airflow skills. Key responsibilities include developing ETL pipelines and optimizing workflows. Experience with SQL, cloud platforms, and data modeling is essential.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
September 11, 2025
Project duration
Unknown
Location type
Unknown
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Austin, TX
Skills detailed
#Data Modeling #Pandas #Python #GIT #Kubernetes #Cloud #Monitoring #Data Warehouse #Data Pipeline #Databases #Snowflake #Documentation #Storage #Airflow #Apache Airflow #Data Engineering #SQLAlchemy #ETL (Extract, Transform, Load) #Observability #Docker #Azure #BigQuery #Data Quality #Scala #AWS (Amazon Web Services) #Libraries #Redshift #GCP (Google Cloud Platform)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Integrass, is seeking the following. Apply via Dice today!
We are seeking a skilled ETL Developer with strong expertise in Python and Apache Airflow to design, build, and optimize scalable data pipelines. In this role, you will develop production-ready workflows, ensure pipeline reliability, and collaborate with data teams to support business needs.
Responsibilities
• Develop, test, and deploy Python-based ETL pipelines using Apache Airflow (a minimal sketch follows this list).
• Write efficient, reusable Python scripts for transformations, validations, and data quality checks.
• Manage scheduling, orchestration, and monitoring of workflows in Airflow.
• Collaborate with data engineers and analysts to design pipelines aligned with business requirements.
• Troubleshoot, optimize, and scale existing ETL jobs.
• Implement best practices for code quality, testing, and CI/CD.
• Contribute to documentation, observability, and knowledge sharing.
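To make the first two responsibilities concrete, here is a minimal sketch of a daily Python ETL pipeline using the Airflow 2.x TaskFlow API. The DAG name, schedule, and extract/load targets are hypothetical illustrations, not requirements from this posting.

# Minimal sketch of a daily ETL DAG (Airflow 2.x TaskFlow API).
# The pipeline name, schedule, and data sources are hypothetical.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder extract; a real task would pull from an API or database.
        return [{"id": 1, "amount": "42.0"}, {"id": 2, "amount": "17.5"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Enforce numeric types as a simple data quality check.
        df = pd.DataFrame(rows)
        df["amount"] = df["amount"].astype(float)
        return df.to_dict("records")

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load; a real task would write to a warehouse table.
        print(f"Loaded {len(rows)} rows")

    load(transform(extract()))

example_etl()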
Required Skills
• Strong experience with Python (pandas, SQLAlchemy, or similar libraries for ETL); see the sketch after this list.
• Proficiency with Apache Airflow: DAG design, task orchestration, and operators.
• Hands-on experience with dependency/environment management (pipenv, poetry, or conda).
• Solid knowledge of SQL and relational databases.
• Understanding of data modeling and transformation patterns (star schema, SCDs, etc.).
• Familiarity with Git workflows and CI/CD pipelines.
• Ability to work independently and collaboratively in a fast-paced environment.
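As a rough illustration of the pandas/SQLAlchemy skill set named above, here is a hedged sketch of an extract-transform-load step with a basic data quality gate; the connection URL, SQL query, and table names are hypothetical placeholders.

# Sketch of a pandas + SQLAlchemy ETL step with a simple quality check.
# Connection URL and table names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/analytics")

# Extract: pull raw rows from a staging table.
df = pd.read_sql("SELECT id, amount, updated_at FROM staging_orders", engine)

# Transform + validate: coerce types and reject negative or missing amounts.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
bad = df["amount"].isna() | (df["amount"] < 0)
if bad.any():
    raise ValueError(f"{int(bad.sum())} rows failed the amount check")

# Load: write the curated result back in one pass.
df.to_sql("orders_clean", engine, if_exists="replace", index=False)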
Nice to Have
• Cloud platform experience (AWS, Google Cloud Platform, or Azure) for pipelines and storage.
• Familiarity with containerization tools (Docker, Kubernetes).
• Exposure to data warehouses (Snowflake, BigQuery, Redshift).