Lumicity

Senior Databricks Solution Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Databricks Solution Architect in Santa Clara, CA, with a 9–12 month contract and a focus on end-to-end platform design. Key skills include Databricks Lakehouse architecture, Spark expertise, and cloud platform experience (AWS/Azure).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
1040
-
🗓️ - Date
April 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Santa Clara, CA
-
🧠 - Skills detailed
#Kafka (Apache Kafka) #Data Modeling #Databricks #Spark SQL #Data Quality #Data Design #Snowflake #Data Ingestion #Monitoring #Spark (Apache Spark) #Strategy #Storage #Scala #Azure #Data Governance #Leadership #Data Pipeline #Observability #ETL (Extract, Transform, Load) #Data Lineage #Delta Lake #AWS (Amazon Web Services) #Cloud #R #PySpark #SQL (Structured Query Language)
Role description
Senior Databricks Solution Architect (Lakehouse Platform | AWS/Azure)
Location: Santa Clara, CA (on-site 5 days a week)
Contract: 9–12 months

The Opportunity
We're supporting a team building and evolving a modern data platform on Databricks, with a focus on scalability, governance, and production-grade architecture. This role is not pipeline development; it is end-to-end platform design. They need someone who can own the architecture of the Databricks environment, define best practices, and ensure the platform is built correctly from the ground up.

What You'll Be Doing
• Lead the end-to-end architecture of a Databricks Lakehouse platform, including data ingestion, transformation, storage, and serving layers
• Define and implement best practices across the Medallion architecture (Bronze/Silver/Gold) and scalable data design patterns
• Design and enforce data governance frameworks using Unity Catalog, RBAC, and data lineage strategies
• Architect data pipelines and workflows using Delta Live Tables, Databricks Jobs, and orchestration tools
• Optimize platform performance through cluster configuration, workload management, and cost optimization strategies
• Establish data quality, observability, and monitoring frameworks across the platform
• Collaborate with engineering and business teams to translate requirements into scalable, production-ready data solutions
• Provide technical leadership and guidance to engineers, ensuring adherence to architectural standards

What They're Looking For
Must-have:
• Proven experience as a Databricks Solution Architect (not just engineer-level usage)
• Deep expertise in Databricks Lakehouse architecture, including Delta Lake, Unity Catalog, Delta Live Tables, and the Medallion architecture
• Strong experience designing end-to-end data platforms in Databricks (not just pipelines)
• Hands-on expertise with Spark (PySpark / Spark SQL)
• Experience with cloud platforms (AWS and/or Azure) in a data platform context
• Strong understanding of data modeling, governance, and scalable architecture patterns

Nice-to-have:
• Experience with multi-tenant or platform-style data environments
• Exposure to real-time or streaming architectures (Kafka, Structured Streaming)
• Experience integrating Databricks with Snowflake or other downstream systems
• Background in R&D, manufacturing, or high-scale enterprise environments

What Makes This Role Different
• True architecture ownership, not just pipeline development
• Opportunity to shape a modern Databricks platform from a design perspective
• High visibility with leadership and direct impact on platform strategy
• Blend of technical depth and architectural leadership
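For candidates less familiar with the Medallion layering the role centers on, here is a minimal, framework-free sketch of the Bronze/Silver/Gold flow. In a real Databricks build these layers would be Delta Lake tables transformed with PySpark or Delta Live Tables; the record shapes and function names below are hypothetical, for illustration only.

```python
def to_bronze(raw_events):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [{"source": "api", "payload": e} for e in raw_events]

def to_silver(bronze):
    """Silver: clean and conform - drop malformed rows, normalize fields."""
    silver = []
    for row in bronze:
        p = row["payload"]
        if "user" in p and "amount" in p:  # basic data-quality gate
            silver.append({"user": p["user"].strip().lower(),
                           "amount": float(p["amount"])})
    return silver

def to_gold(silver):
    """Gold: business-level aggregate - total spend per user."""
    totals = {}
    for row in silver:
        totals[row["user"]] = totals.get(row["user"], 0.0) + row["amount"]
    return totals

raw = [{"user": " Ana ", "amount": "10.5"},
       {"user": "bob", "amount": "4.5"},
       {"bad": "record"},
       {"user": "Ana", "amount": "2.0"}]

gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'ana': 12.5, 'bob': 4.5}
```

The point of the pattern is that each layer has one job: Bronze preserves the raw input for replay, Silver applies the data-quality and conformance rules, and Gold serves business-ready aggregates downstream.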