

Call Quest Solution
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer; the contract length and pay rate are unspecified. Remote work is allowed. It requires 10+ years in Data Engineering, expertise in Apache Airflow and dbt Core, and experience with Kubernetes/OpenShift.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Kubernetes #Scala #Observability #dbt (data build tool) #Batch #Data Quality #Apache Airflow #Airflow #SQL (Structured Query Language) #Data Pipeline #Python #Data Engineering
Role description
About the Role
Looking for a Senior Data Engineer with strong expertise in Apache Airflow, dbt Core, and Kubernetes/OpenShift to build and optimize scalable, enterprise data pipelines.
Key Responsibilities
• Design and manage Airflow DAGs for batch and event-driven pipelines
• Build and optimize dbt Core data models (staging, marts, testing)
• Deploy and manage workloads on Kubernetes/OpenShift
• Monitor and improve pipeline performance, scalability, and reliability
• Implement data quality, governance, and observability
• Collaborate with cross-functional teams on data solutions
Required Skills
• 10+ years in Data Engineering
• Strong expertise in Apache Airflow and dbt Core
• Proficiency in Python and SQL
• Experience with Kubernetes/OpenShift
• Experience with enterprise data platforms and large-scale processing
