Databricks Architect (Airflow)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Architect (Airflow); the contract length and pay rate are unspecified, and the work is remote. Requires 8+ years in data engineering, strong Databricks and Airflow expertise, and cloud platform experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
June 1, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Modeling #Monitoring #Cloud #SQL (Structured Query Language) #Big Data #Agile #Terraform #ETL (Extract, Transform, Load) #Apache Airflow #MLflow #Spark (Apache Spark) #DevOps #Data Architecture #Azure DevOps #Databricks #AWS (Amazon Web Services) #Airflow #Data Pipeline #GitHub #Delta Lake #Azure #GCP (Google Cloud Platform) #Python #Data Engineering
Role description
Required Skills and Qualifications:
• 8+ years of experience in data engineering, cloud architecture, or similar roles.
• 3+ years of hands-on experience with Databricks (Spark, Delta Lake).
• Strong expertise in Apache Airflow (DAG design, orchestration patterns, monitoring).
• Proficiency in Python, SQL, and data modeling.
• Deep understanding of big data architectures, ETL/ELT workflows, and streaming data pipelines.
• Experience with cloud platforms like Azure, AWS, or GCP.
• Familiarity with DevOps practices and CI/CD tools (e.g., GitHub Actions, Azure DevOps).
• Strong problem-solving skills and the ability to work in a fast-paced, agile environment.
Preferred Qualifications:
• Databricks certification (e.g., Databricks Certified Data Engineer or Solutions Architect).
• Experience with MLflow, Delta Live Tables, or Unity Catalog.
• Experience deploying data solutions in highly regulated industries.
• Knowledge of infrastructure-as-code tools like Terraform or ARM templates.
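For context on the Airflow requirement above (DAG design and orchestration patterns), a minimal ETL-style DAG might look like the sketch below. This is an illustrative example, not part of the role: the DAG id, task names, and callables are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


# Hypothetical task callables; in a real pipeline these would read from a
# source system, transform with Spark/Databricks, and write to Delta Lake.
def extract():
    print("extracting source data")


def transform():
    print("transforming data")


def load():
    print("loading into Delta Lake")


with DAG(
    dag_id="example_etl",           # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",              # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # DAG dependency chain: extract runs first, then transform, then load.
    t_extract >> t_transform >> t_load
```

The file is a DAG definition that an Airflow scheduler would pick up from its DAGs folder; it is not meant to be executed standalone.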