Tenth Revolution Group

Databricks Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Architect, fully remote within specific U.S. states, on a minimum 6-month contract at 40 hours per week. Requires strong Databricks and Spark expertise, experience with cloud data platforms, and data governance knowledge.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 10, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Arizona, United States
-
🧠 - Skills detailed
#Batch #Data Modeling #Cloud #Spark SQL #Delta Lake #Databricks #Observability #PySpark #Security #Data Pipeline #Lambda #Data Engineering #Consulting #Spark (Apache Spark) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Governance #Data Architecture #Scala
Role description
Location: Fully remote (must be based in Texas, Arizona, Tennessee, Florida, Indiana, or Georgia)
Contract: Minimum 6 months, 40 hours per week (extension likely)

A U.S.-based consulting firm is seeking an experienced Databricks Architect to support enterprise, client-facing data platform initiatives. This is a hands-on architecture and delivery role focused on modern lakehouse platforms and Databricks-led implementations.

Responsibilities
• Design and own Databricks-based lakehouse architectures (Medallion, Lambda, Kappa)
• Lead data modeling, governance, security, and scalability decisions
• Build and optimize Spark-based data pipelines using PySpark and Spark SQL
• Define CI/CD patterns, ingestion strategies (batch and streaming), and observability standards
• Act as technical lead for data engineering teams and perform architecture reviews

Required Experience
• Prior experience as a Data Architect or Lead Engineer on enterprise data platforms
• Strong hands-on Databricks experience (Delta Lake, Unity Catalog, DLT, Structured Streaming)
• Advanced Spark and PySpark expertise
• Experience with cloud data platforms, ETL/ELT tooling, and CI/CD pipelines
• Solid understanding of cloud security and data governance