Databricks Solutions Architect Lead

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Solutions Architect Lead on a 6-month initial contract (extension likely), remote (US-based), at $120/hr. Requires 7+ years in Data Engineering, hands-on Databricks expertise, strong SQL skills, and proven consulting experience. Databricks certification preferred.
🌎 - Country: United States
💱 - Currency: $ USD
💰 - Day rate: $960
🗓️ - Date discovered: September 30, 2025
🕒 - Project duration: More than 6 months
🏝️ - Location type: Remote
📄 - Contract type: Unknown
🔒 - Security clearance: Unknown
📍 - Location detailed: Cary, NC
🧠 - Skills detailed: #Security #DevOps #Consulting #Code Reviews #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #Automation #MLflow #BI (Business Intelligence) #Deployment #Scala #Databricks #Data Engineering #PySpark #Data Governance #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #DataOps #Spark (Apache Spark) #Strategy #Azure #ML (Machine Learning) #AI (Artificial Intelligence) #Data Vault #Leadership #Delta Lake #Data Strategy #Data Modeling #Cloud
Role description
Job Title: Databricks Solution Architect / Lead Consultant
Industry: Professional Services
Duration: 6-month initial contract (extension likely)
Hours: Full-time, 40 hrs/week
Pay Rate: $120/hr
Location: Remote (US-based)

Position Overview
We are seeking an experienced Databricks Solution Architect / Lead Consultant with deep expertise in data engineering and cloud-based analytics. In this role, you will serve as the primary client-facing technical lead, driving solution design and architecture for enterprise-level Databricks Lakehouse implementations. You will translate complex business needs into scalable, cost-efficient data solutions, guide both local and offshore development teams, and ensure high-quality delivery. This is a high-visibility role with direct client interaction, where strong communication and leadership skills are as critical as your technical expertise.

Key Responsibilities

Client & Requirements Management
• Act as the primary technical liaison with clients, leading discussions to capture business needs, technical challenges, and data strategy goals.
• Conduct workshops and deep-dive sessions to gather detailed requirements for ingestion, transformation, modeling, and data consumption.

Solution Design & Architecture
• Architect end-to-end data solutions on the Databricks Lakehouse Platform (Delta Lake, Unity Catalog, Spark, Photon, etc.).
• Produce detailed design deliverables (architecture diagrams, flow maps, technical specs, implementation plans).
• Establish best practices for data governance, performance tuning, security, and cost optimization across cloud environments (AWS, Azure, or GCP).

Team Leadership & Delivery Oversight
• Provide direction to offshore development teams, ensuring alignment with approved architecture and quality standards.
• Translate designs into clear technical tasks, conduct code reviews, and provide ongoing technical guidance.
• Support integration, deployment, and troubleshooting during the build and release phases.

Must-Have Qualifications
• 7+ years of experience in Data Engineering, Data Warehousing, or BI with a strong focus on solution design.
• Hands-on Databricks expertise, including:
  • PySpark/Scala for building scalable pipelines.
  • Delta Lake and Lakehouse architecture.
  • Security and governance with Unity Catalog.
• Strong SQL skills and data modeling (Kimball, Data Vault, or similar).
• Experience working with at least one cloud provider's data stack (AWS, Azure, or GCP).
• Proven consulting experience, with strong communication skills and the ability to present technical concepts to non-technical stakeholders.
• Demonstrated ability to manage and direct offshore/global teams.

Preferred Skills
• Databricks certification (Data Engineer Professional or higher).
• Familiarity with DevOps/DataOps practices (CI/CD, testing automation in Databricks).
• Exposure to ML/AI use cases and Databricks MLflow.
• Prior professional services/consulting background.

Why Apply
• Direct involvement in enterprise-scale Databricks implementations.
• Opportunity to own solution architecture and directly advise stakeholders.
• Work in a global delivery environment with leadership visibility.
• Competitive compensation and long-term contract extension potential.

#TECH #Remote
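To gauge fit with the hands-on expectations above (PySpark pipelines writing to Delta Lake), here is a minimal, illustrative sketch of a raw-to-Delta ingestion step. It is not part of the posting; the landing path and table name are hypothetical placeholders, and it assumes a Databricks or otherwise Delta-enabled Spark session.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a session already exists; getOrCreate() reuses it.
spark = SparkSession.builder.appName("bronze_ingest").getOrCreate()

# Read raw JSON events from a hypothetical landing path.
raw = spark.read.format("json").load("/mnt/raw/events/")

# Tag each record with an ingestion timestamp.
bronze = raw.withColumn("ingested_at", F.current_timestamp())

# Append into a Delta table (hypothetical Unity Catalog name: catalog.schema.table).
(
    bronze.write.format("delta")
    .mode("append")
    .saveAsTable("main.bronze.events")
)
```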