Databricks SME / Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a "Databricks SME / Architect" in Indianapolis, IN, on a long-term hybrid contract. Requires a Bachelor's in Computer Science, Databricks certification, 5+ years of relevant experience, and expertise in Azure Cloud, PySpark, and data governance.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 10, 2025
πŸ•’ - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Indianapolis, IN
-
🧠 - Skills detailed
#Compliance #Security #Azure DevOps #Observability #Vault #GDPR (General Data Protection Regulation) #ADLS (Azure Data Lake Storage) #JSON (JavaScript Object Notation) #PySpark #DevOps #Cloud #Azure #GitHub #Azure cloud #Computer Science #Databricks #Spark (Apache Spark) #Scala #Terraform #Data Engineering
Role description
We are hiring a "Databricks SME / Architect" for the Indianapolis, IN location. This is a long-term contract with a hybrid work mode. The SME will build, operate, and govern production-grade data and analytics solutions spanning Databricks (Pipelines, Delta Live Tables, Genie, Agent Bricks), and will be responsible for delivering fast, reliable, and cost-optimized data flows while maintaining enterprise-grade security and observability.
Required qualifications:
• Bachelor's degree in Computer Science or a related field.
• Databricks Certified Data Engineer (Associate/Professional).
• At least 5 years of experience with Databricks Pipelines, Delta Live Tables, Genie, and Agent Bricks; strong PySpark/Scala; Unity Catalog administration.
• Azure Cloud: ADLS Gen2, Event Hub, Service Bus, Azure Functions, Key Vault, Azure DevOps/GitHub Actions, Terraform/ARM.
• Data Modelling: star schema, CDC, handling JSON/Parquet/Avro.
• Governance & Security: Unity Catalog, Microsoft Purview, row-level security, GDPR/CCPA compliance.
• CI/CD & Testing: automated unit/integration/end-to-end tests; GitOps workflow.
• Observability: Azure Monitor, Log Analytics, dashboards for pipeline health.
• Excellent communication and stakeholder management skills.
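For candidates less familiar with Delta Live Tables, the sketch below illustrates the kind of PySpark pipeline work the role describes. It is a minimal, hypothetical example rather than the client's actual stack: the ADLS Gen2 landing path and table names are placeholders, and the code assumes it runs inside a Databricks Delta Live Tables pipeline, where the `dlt` module and the ambient `spark` session are available.

```python
# Minimal Delta Live Tables sketch (hypothetical paths and table names).
import dlt
from pyspark.sql import functions as F

# Bronze: ingest raw JSON from a placeholder ADLS Gen2 landing zone with Auto Loader.
@dlt.table(comment="Raw orders landed as JSON")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("abfss://landing@example.dfs.core.windows.net/orders/")  # placeholder path
    )

# Silver: enforce a basic data-quality expectation and stamp ingestion time.
@dlt.table(comment="Validated orders")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("ingested_at", F.current_timestamp())
    )
```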