

SPADTEK SOLUTIONS
Resident Solution Architect – Databricks
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Resident Solution Architect – Databricks, remote, with an unspecified contract length and pay rate. It requires 12+ years in data engineering, strong Databricks expertise, cloud experience (AWS/Azure/GCP), and relevant certifications.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Leadership #Microsoft Azure #AI (Artificial Intelligence) #Documentation #Programming #Delta Lake #Data Pipeline #SQL (Structured Query Language) #Kafka (Apache Kafka) #Apache Spark #Cloud #Security #Monitoring #Data Warehouse #Deployment #Data Governance #Azure #DevOps #Scala #Data Engineering #Version Control #Data Lake #Migration #MLflow #AWS (Amazon Web Services) #Consulting #Databricks #Spark (Apache Spark) #Terraform #Big Data #Infrastructure as Code (IaC) #GCP (Google Cloud Platform) #Automation #ML (Machine Learning) #Python
Role description
Job Title: Resident Solution Architect – Databricks
Experience: 12+ Years
Location: Remote
Role Overview
We are looking for an experienced Resident Solution Architect with strong expertise in the Databricks Lakehouse Platform to design, architect, and deliver scalable, secure, and production-grade Data & AI solutions.
The ideal candidate should have deep hands-on technical expertise, strong consulting experience, and proven experience leading enterprise-level data platform implementations and migrations across cloud environments.
Key Responsibilities
Solution Architecture & Delivery
• Design and build production-ready reference architectures using Lakehouse and Delta Lake best practices
• Architect scalable big data and AI solutions on Databricks
• Lead migrations (ETL/ELT, data warehouses, legacy systems) to a modern Lakehouse architecture
• Provide architecture consulting, cluster optimization, and performance tuning
• Implement data governance and security using Unity Catalog
Customer Engagement & Delivery Management
• Scope, plan, and manage technical engagements
• Drive end-to-end project delivery (Design → Development → Deployment → Optimization)
• Manage timelines, risks, and deliverables
• Provide support for complex production issues
Platform Engineering & DevOps
• Implement CI/CD pipelines for code and infrastructure
• Deploy infrastructure using Terraform and Databricks Asset Bundles (DAB)
• Set up job scheduling, monitoring, and production management
• Establish best practices for version control and automation
Optimization & Continuous Improvement
• Monitor and optimize data pipelines and ML models
• Improve system performance and cost efficiency
• Contribute reusable assets and documentation
• Enable customer teams through knowledge transfer
Leadership & Enablement
• Mentor and train customer teams
• Provide technical leadership across engagements
• Support pre-sales activities and architecture discussions
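As an illustration of the Lakehouse responsibilities above, a bronze-to-silver refinement step in the medallion pattern might look like the following. This is a minimal stand-alone sketch using plain Python dictionaries in place of Spark DataFrames; the record fields and the `to_silver` name are hypothetical, not part of any Databricks API.

```python
# Minimal sketch of a bronze -> silver step in the medallion pattern.
# Plain Python stands in for Spark; field names are hypothetical.

def to_silver(bronze_records):
    """Deduplicate by customer_id, drop rows missing the key, normalize email."""
    seen = set()
    silver = []
    for rec in bronze_records:
        if rec.get("customer_id") is None:
            continue  # a real pipeline would quarantine incomplete rows
        if rec["customer_id"] in seen:
            continue  # keep the first occurrence only
        seen.add(rec["customer_id"])
        silver.append({**rec, "email": rec.get("email", "").strip().lower()})
    return silver

raw = [
    {"customer_id": 1, "email": " A@Example.com "},
    {"customer_id": 1, "email": "a@example.com"},    # duplicate key
    {"customer_id": None, "email": "x@example.com"}, # missing key
]
print(to_silver(raw))  # one cleaned record survives
```

In a real engagement the same cleansing logic would be expressed as Spark transformations writing to a Delta table, but the deduplicate-validate-normalize shape is the same.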
Required Skills & Qualifications (Mandatory)
12+ years of hands-on experience in:
• Data Engineering
• Data Platforms
• Data Analytics
• Data Warehousing
• Big Data technologies (Kafka, Data Lakes, Cloud-native solutions)
Cloud Expertise
Hands-on experience in at least one of:
• AWS
• Microsoft Azure
• Google Cloud Platform (GCP)
Programming Skills
• Python
• SQL
• Scala
Databricks Expertise
Strong hands-on experience with:
• Databricks SQL
• Apache Spark
• Delta Lake
• MLflow
• Unity Catalog
• Delta Live Tables (DLT)
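To make the Delta Lake item above concrete: Delta tables keep a transaction log of commits, which is what enables "time travel" between table versions. The class below is a toy, pure-Python illustration of that idea; it is not the Delta Lake API, and the `VersionedTable` name is invented for this sketch.

```python
# Toy illustration of Delta Lake's versioned-table / time-travel concept.
# Not the real Delta API -- a pure-Python sketch of the idea only.

class VersionedTable:
    def __init__(self):
        self._log = []  # each commit is the list of rows appended in it

    def append(self, rows):
        """Commit a batch of rows; return the resulting version number."""
        self._log.append(list(rows))
        return len(self._log) - 1

    def read(self, version=None):
        """Read the table as of `version` (latest version if None)."""
        if version is None:
            version = len(self._log) - 1
        rows = []
        for commit in self._log[: version + 1]:
            rows.extend(commit)
        return rows

t = VersionedTable()
v0 = t.append([{"id": 1}])
t.append([{"id": 2}])
print(t.read())            # both rows at the latest version
print(t.read(version=v0))  # "time travel": only the first commit
```

Real Delta Lake records commits as JSON/parquet files in a `_delta_log` directory and supports updates and deletes as well, but the append-only log shown here is the core of how historical versions stay readable.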
Migration & Architecture
• Experience leading enterprise workload migrations
• Strong knowledge of ETL/ELT design patterns
• Experience modernizing legacy data systems
Deployment & Automation
• Databricks Asset Bundles (DAB)
• Terraform
• CI/CD pipelines
• Infrastructure as Code (IaC)
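For the Databricks Asset Bundles item above, a minimal `databricks.yml` might look like the sketch below. The bundle name, workspace host, job key, and notebook path are all placeholders, and this is a trimmed illustration of the bundle layout rather than a complete, validated configuration.

```yaml
# Minimal databricks.yml sketch (all names and paths are placeholders).
bundle:
  name: example_bundle            # hypothetical bundle name

targets:
  dev:
    mode: development
    workspace:
      host: https://example.cloud.databricks.com   # placeholder workspace URL

resources:
  jobs:
    nightly_etl:                  # hypothetical job key
      name: nightly-etl
      tasks:
        - task_key: run_pipeline
          notebook_task:
            notebook_path: ./notebooks/pipeline.py  # placeholder path
```

Deploying such a bundle with `databricks bundle deploy -t dev` is the typical workflow, which pairs naturally with the Terraform and CI/CD requirements listed above.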
Certifications
• Databricks Certified Data Engineer Associate
• Databricks Certified Data Engineer Professional (Preferred)
Soft Skills
• Excellent communication
• Strong stakeholder management
• Leadership & mentoring ability
• Consulting and problem-solving skills