

Data Architect – Databricks
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect – Databricks with a contract length of "unknown" and a pay rate of "unknown." Requires 12+ years in data architecture, 5+ years in Databricks, expertise in Snowflake, and proficiency in Apache Spark and ETL/ELT pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date discovered
August 19, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Santa Clara, CA
-
🧠 - Skills detailed
#Security #Snowflake #Spark (Apache Spark) #AI (Artificial Intelligence) #Apache Airflow #Infrastructure as Code (IaC) #Tableau #Data Engineering #MLflow #BI (Business Intelligence) #Airflow #Apache Spark #AWS (Amazon Web Services) #Microsoft Power BI #Data Architecture #Compliance #dbt (data build tool) #Delta Lake #Databricks #ML (Machine Learning) #DevOps #Data Governance #Cloud #Data Modeling #ETL (Extract, Transform, Load) #Terraform #PySpark
Role description
Required Skills & Qualifications
• 12+ years of experience in data architecture, with 5+ years hands-on in Databricks.
• Strong experience in Snowflake.
• Experience with cloud platforms, especially Databricks on AWS.
• Strong proficiency in Apache Spark, Delta Lake, and PySpark.
• Experience with data modeling, ETL/ELT pipelines, and data warehousing.
• Familiarity with CI/CD, DevOps, and Infrastructure as Code (Terraform, ARM templates).
• Knowledge of data governance, security, and compliance frameworks.
• Excellent communication and stakeholder management skills.
Preferred Qualifications
• Databricks Certified Data Engineer or Architect.
• Experience with MLflow, Unity Catalog, and Lakehouse architecture.
• Background in machine learning, AI, or advanced analytics.
• Experience with tools like Apache Airflow, dbt, or Power BI/Tableau.