

Airflow Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Airflow Data Engineer in Austin, TX, for 12 months, paying $57.00 - $62.00/hr on W2 or $62.00 - $71.00/hr on C2C. It requires 5-7 years of experience with Fabric & Azure Airflow, Spark, and cloud-native architectures.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
568
-
🗓️ - Date discovered
September 24, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Austin, TX
-
🧠 - Skills detailed
#Data Engineering #Scala #Airflow #Data Pipeline #ETL (Extract, Transform, Load) #Azure #Visualization #Monitoring #Data Architecture #Cloud #Spark (Apache Spark)
Role description
Job Title: Airflow Data Engineer
Location: Austin, TX
Duration: 12 Months
Pay Range: $57.00 - $62.00/hr on W2, all-inclusive, without benefits
Pay Range: $62.00 - $71.00/hr on C2C
Job Description:
· 5-7 years of hands-on development experience with Fabric & Azure Airflow
· Strong expertise in Spark and Airflow for building scalable, distributed data pipelines.
· Solid experience in Azure services and cloud-native architectures.
· Proven ability to design and implement end-to-end pipelines, including error handling, monitoring, alerting, and self-healing capabilities (see the illustrative sketch after this list).
· Good understanding of data architecture principles, best practices for ingestion, transformation, and serving layers.
· Able to design database models that best serve visualization needs, based on interactions with product owners and business owners
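To illustrate the pipeline requirement above, here is a minimal sketch of an Airflow DAG with automatic retries (a basic form of self-healing) and a failure-alerting callback. It assumes Apache Airflow 2.4+; the DAG id, task names, and alerting hook are hypothetical placeholders, not part of any specific codebase for this role.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_failure(context):
    # Hypothetical alerting hook: in practice this might post to Teams,
    # PagerDuty, or an Azure Monitor action group.
    ti = context["task_instance"]
    print(f"ALERT: task {ti.task_id} failed in DAG {ti.dag_id}")


def extract():
    # Placeholder: read source data, e.g. from an Azure storage account.
    pass


def transform():
    # Placeholder: run a Spark / Fabric transformation step.
    pass


def load():
    # Placeholder: write results to the serving layer.
    pass


default_args = {
    "owner": "data-engineering",
    "retries": 3,                           # self-healing via automatic retries
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,  # alerting once retries are exhausted
}

with DAG(
    dag_id="example_end_to_end_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Simple ingestion -> transformation -> serving dependency chain.
    t_extract >> t_transform >> t_load
```

In a production pipeline, the callback would typically feed a monitoring channel and the individual tasks would be replaced by Spark or Fabric jobs; the structure above only shows where retries, alerting, and layer-by-layer dependencies plug in.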