Compunnel Inc.

Databricks Solutions Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Solutions Architect on a 12-month contract, offering $65.00 - $70.00 per hour. Required skills include 10+ years in data architecture, 4+ years with Databricks, Delta Lake expertise, and strong SQL proficiency. On-site work in Juno Beach, FL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
πŸ—“οΈ - Date
March 20, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Juno Beach, FL 33408
-
🧠 - Skills detailed
#BI (Business Intelligence) #Computer Science #Delta Lake #Consulting #Kafka (Apache Kafka) #Data Security #Leadership #Scala #AWS (Amazon Web Services) #PySpark #Spark (Apache Spark) #Batch #Cloud #SQL (Structured Query Language) #Security #Migration #ETL (Extract, Transform, Load) #Deployment #Azure #Data Engineering #Storage #Databricks #Semantic Models #Strategy
Role description
Job Summary
Seeking an experienced Databricks Solutions Architect to lead the design and modernization of enterprise data platforms leveraging Databricks and cloud-native technologies. This individual will serve as the primary architectural authority on assigned engagements, translating business requirements into scalable, secure, and high-performing lakehouse solutions. In this role, you will drive ingestion architecture, transformation design, semantic modeling oversight, governance alignment, and performance optimization strategy. You will work closely with Program Leadership, Lead Engineers, Governance teams, and BI stakeholders to ensure the platform architecture supports both immediate delivery objectives and long-term enterprise scalability. This position requires strong client-facing consulting skills, architectural depth, and hands-on familiarity with Databricks implementation patterns.
Key Responsibilities:
- Lead end-to-end solution architecture for Databricks-based data platforms.
- Design scalable ingestion frameworks (batch and streaming) aligned to enterprise data volumes and SLAs.
- Architect Medallion (raw/src/core/mart) lakehouse patterns and enforce clear data layer separation.
- Define enterprise Delta Lake architecture standards, including schema evolution, partitioning strategy, performance optimization, and storage design.
- Provide semantic modeling oversight to ensure alignment between transformation logic and downstream BI/reporting requirements.
- Design environment strategies (dev/qa/prod), workspace structuring, and deployment promotion models.
- Establish governance alignment, including Unity Catalog structure, RBAC strategy, lineage design, and data security standards.
- Conduct architecture workshops, whiteboarding sessions, and technical design reviews with stakeholders.
- Define integration patterns with upstream systems, APIs, and enterprise data sources.
- Partner with engineering teams to ensure architectural standards are implemented correctly.
- Identify performance risks and scalability concerns early and propose mitigation strategies.
- Contribute to roadmap development, modernization strategy, and technical debt reduction planning.
- Support pre-sales efforts, including solution scoping, technical discovery, and proposal development.
Required Skills & Experience:
- 10+ years of experience in data engineering and architecture roles.
- 4+ years designing and implementing Databricks-based solutions.
- Deep expertise in Delta Lake architecture and performance optimization.
- Strong understanding of Medallion lakehouse architecture patterns.
- Advanced proficiency in SQL and working knowledge of PySpark.
- Experience designing scalable ingestion and transformation frameworks.
- Hands-on experience implementing Unity Catalog and enterprise governance controls.
- Experience designing semantic models and aligning architecture to BI/reporting use cases.
- Strong client-facing consulting and presentation skills.
- Experience leading technical design sessions and architecture governance reviews.
- Bachelor's degree in Computer Science, Information Systems, or equivalent experience.
Preferred Qualifications:
- Databricks certifications.
- Experience with streaming architectures (Kafka, Event Hubs, or Delta Live Tables).
- Experience designing CI/CD deployment frameworks for Databricks.
- Background in enterprise cloud environments (Azure or AWS).
- Experience in large-scale modernization or platform migration initiatives.
Pay: $65.00 - $70.00 per hour
Work Location: In person
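For candidates unfamiliar with the Medallion layering this role centers on, the raw-to-mart progression can be sketched in Databricks SQL. This is a minimal, illustrative sketch only; the catalog, schema, and table names (demo, raw, core, mart, orders) are invented for the example, not taken from the posting.

```sql
-- Illustrative Medallion layering in Databricks SQL (names are hypothetical).

-- Raw layer: land source data as-is into a Delta table, untyped.
CREATE TABLE IF NOT EXISTS demo.raw.orders (
  order_id     STRING,
  order_ts     STRING,     -- kept as raw text at this layer
  amount       STRING,
  _ingested_at TIMESTAMP
) USING DELTA;

-- Core layer: cleaned, typed, deduplicated version of the raw data.
CREATE OR REPLACE TABLE demo.core.orders AS
SELECT DISTINCT
  order_id,
  CAST(order_ts AS TIMESTAMP)    AS order_ts,
  CAST(amount AS DECIMAL(12, 2)) AS amount
FROM demo.raw.orders
WHERE order_id IS NOT NULL;

-- Mart layer: business-level aggregate for BI/reporting consumption.
CREATE OR REPLACE TABLE demo.mart.daily_revenue AS
SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue
FROM demo.core.orders
GROUP BY DATE(order_ts);
```

The point of the layer separation is that downstream consumers only ever read the mart layer, so raw-ingestion changes stay isolated from BI semantics.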
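The Unity Catalog governance work described above largely comes down to catalog- and schema-level privilege grants. A minimal sketch follows; the principal names (`analysts`, `data_engineers`) and object names are illustrative assumptions.

```sql
-- Hypothetical Unity Catalog RBAC sketch; all names are invented for illustration.
CREATE CATALOG IF NOT EXISTS demo;

-- Analysts: read-only access to the mart layer.
GRANT USE CATALOG ON CATALOG demo      TO `analysts`;
GRANT USE SCHEMA  ON SCHEMA demo.mart  TO `analysts`;
GRANT SELECT      ON SCHEMA demo.mart  TO `analysts`;

-- Data engineers: full control over the raw and core layers.
GRANT ALL PRIVILEGES ON SCHEMA demo.raw  TO `data_engineers`;
GRANT ALL PRIVILEGES ON SCHEMA demo.core TO `data_engineers`;
```

Granting at the schema level rather than per-table keeps the RBAC strategy aligned with the Medallion layer boundaries, which is typically the simplest governance model to audit.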