

Tenth Revolution Group
Azure Databricks Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Databricks Architect, fully remote (must be based in Texas), on a minimum 6-month contract at a day rate of $800. Key skills include Databricks, Spark, PySpark, and cloud data platforms.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
800
-
🗓️ - Date
December 20, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Texas, United States
-
🧠 - Skills detailed
#Data Modeling #Spark (Apache Spark) #Observability #Data Architecture #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #Data Engineering #Delta Lake #Data Pipeline #Security #Scala #Azure #Databricks #SQL (Structured Query Language) #Consulting #Cloud #Azure Databricks #PySpark #Batch #Spark SQL #Data Governance
Role description
Location: Fully remote (must be based in Texas)
Contract: Minimum 6 months, 40 hours per week (extension likely)
A U.S.-based consulting firm is seeking an experienced Databricks Architect to support enterprise, client-facing data platform initiatives. This is a hands-on architecture and delivery role focused on modern lakehouse platforms and Databricks-led implementations.
Responsibilities
• Design and own Databricks-based lakehouse architectures (Medallion, Lambda, Kappa)
• Lead data modeling, governance, security, and scalability decisions
• Build and optimize Spark-based data pipelines using PySpark and Spark SQL (see the sketch after this list)
• Define CI/CD patterns, ingestion strategies (batch and streaming), and observability standards
• Act as technical lead for data engineering teams and perform architecture reviews
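For context on the pipeline work referenced above, here is a minimal, hedged PySpark sketch of a Medallion-style bronze-to-silver step on Delta Lake. It is illustrative only and not part of the posting; the table paths, column names, and cleansing rules are assumptions.
```python
from pyspark.sql import SparkSession, functions as F

# Illustrative sketch only: paths and columns below are hypothetical.
spark = SparkSession.builder.appName("medallion-silver").getOrCreate()

# Read raw "bronze" records already landed as a Delta table.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

# Promote to "silver": de-duplicate, drop malformed rows, add an audit column.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
    .withColumn("ingested_at", F.current_timestamp())
)

# Write the cleaned result back to the silver layer as Delta.
(silver.write
    .format("delta")
    .mode("overwrite")
    .save("/mnt/lake/silver/orders"))
```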
Required Experience
• Prior experience as a Data Architect or Lead Engineer on enterprise data platforms
• Strong hands-on Databricks experience with Delta Lake, Unity Catalog, DLT, and Structured Streaming (see the streaming sketch after this list)
• Advanced Spark and PySpark expertise
• Experience with cloud data platforms, ETL/ELT tooling, and CI/CD pipelines
• Solid understanding of cloud security and data governance
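As a companion to the Structured Streaming experience listed above, the sketch below shows a basic streaming ingestion into a Delta table using plain Structured Streaming (not DLT). The schema, paths, and trigger interval are illustrative assumptions, not details from the posting.
```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Illustrative sketch only: schema, paths, and trigger cadence are hypothetical.
spark = SparkSession.builder.appName("streaming-ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Stream JSON files from a landing zone using an explicit schema.
events = (spark.readStream
    .schema(schema)
    .json("/mnt/lake/landing/events"))

# Append into a bronze Delta table, with checkpointing for fault tolerance.
query = (events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/lake/_checkpoints/events")
    .outputMode("append")
    .trigger(processingTime="1 minute")
    .start("/mnt/lake/bronze/events"))
```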
Note: Candidates must be based in Texas. Fully remote engagement.