E-Solutions

Databricks Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Architect with a contract length of "unknown," offering a pay rate of "unknown," and is remote. Key skills include Azure, Databricks, Python, and Streamlit. A bachelor's degree is required; certifications in Azure and Databricks are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Iselin, NJ
-
🧠 - Skills detailed
#Databricks #PySpark #GIT #Deployment #Data Quality #Scala #Visualization #Monitoring #Agile #Python #Azure Databricks #Observability #Data Processing #Metadata #Infrastructure as Code (IaC) #Data Architecture #Terraform #Azure DevOps #Data Engineering #Computer Science #Spark (Apache Spark) #SQL (Structured Query Language) #Azure #Data Framework #Strategy #GitLab #DevOps #Automation #Streamlit #Automated Testing
Role description
Job Overview: We are looking for a hands-on Data Architect to design and lead the implementation of a robust data platform on Azure and Databricks. The successful candidate will be responsible for creating, testing, and improving data frameworks to optimize the functionality of our business Your primary focus will be to ensure our data is accurate, auditable, and resilient through advanced Data Observability frameworks. Responsibilities: Architecture Design: Architect scalable Medallion architectures (Bronze/Silver/Gold) in Azure Databricks Data Observability & Quality: Design automated frameworks for data quality to validate schemas, data and business logic in real-time. Reconciliation & Integrity: Build end-to-end reconciliation engines to ensure data consistency between source systems, the lakehouse, and downstream reports. Exception Handling: Develop a centralized exception handling strategy that captures, logs, and routes data processing errors without halting entire pipelines. User Interface: Build interactive Streamlit based user interface to provide stakeholders with capabilities to view/edit data quality rules and handling data exceptions DevOps & Automation: Implement CI/CD pipelines for data infrastructure using Azure DevOps/GitLab, focusing on automated testing and zero-touch deployments. Required Skills: Platform Mastery: Deep expertise in Azure and Databricks Observability Mindset: Proven track record of building metadata-driven frameworks for monitoring data "incidents" rather than just system uptime. Coding: High proficiency in Python, PySpark, and SQL. Visualization: Experience building data apps or internal tools via Streamlit. Agile Methodologies: Familiarity with agile development methodologies. DevOps: Strong understanding of Git-based workflows, Infrastructure as Code (Terraform), and automated testing. Communication Skills: Excellent written and verbal communication skills. 
The candidate must have a bachelor's degree in Computer Science, Information Technology, or a related field; a Master's degree is preferred.

Preferred Skills:
• Experience with the Databricks DQX framework for managing data quality at scale.
• Certifications: Azure Solutions Architect and/or Databricks Certified Data Engineer Professional.