Programmers.io

Data Governance Manager

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a remote Data Governance Manager with a contract length of "unknown" and a pay rate of "$X per hour." It requires 8+ years in Data Governance, proficiency in Databricks Unity Catalog, and a Databricks Certified Data Engineer Professional certification.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Azure #Compliance #Microsoft Power BI #Migration #Data Quality #GDPR (General Data Protection Regulation) #Alation #Security #Azure Databricks #Spark (Apache Spark) #Tableau #Monitoring #Cloud #Data Governance #Spark SQL #ETL (Extract, Transform, Load) #PySpark #Databricks #BI (Business Intelligence) #AWS (Amazon Web Services) #Clustering #dbt (data build tool) #ML (Machine Learning) #Data Engineering #Delta Lake #SQL (Structured Query Language) #Strategy #Collibra
Role description
Senior Data Governance Consultant (Databricks & Unity Catalog)
Location: Remote
Overview: We are seeking a highly experienced Data Governance Consultant to design and operationalize an enterprise-grade governance framework. You will be the primary architect of our data trust strategy, leveraging Databricks Unity Catalog to unify discovery, security, and lineage across our multi-cloud Lakehouse environment.
Key Responsibilities
• Unity Catalog Architecture: Design and deploy global Unity Catalog metastores, defining the three-tier namespace strategy (Catalog > Schema > Table/Volume) to support domain-driven architectures.
• Access Control & Security: Implement fine-grained security models, including Attribute-Based Access Control (ABAC), row-level filtering, and column masking.
• Legacy Migration: Lead the transition from Hive Metastore to Unity Catalog using UCX (the Unity Catalog migration toolkit) and automated upgrade paths.
• AI & Model Governance: Extend governance to AI assets, managing Mosaic AI models, functions, and GenAI agents within the Unity Catalog framework.
• Automated Lineage: Configure end-to-end lineage tracking for Delta Live Tables (DLT), notebooks, and workflows to support audit and impact analysis.
• Data Quality & Monitoring: Implement Databricks Lakehouse Monitoring to automatically track data freshness, completeness, and statistical drift.
Technical Capabilities & Experience
• Cataloging & Sharing: Deep expertise in Unity Catalog (managed and external tables), Volumes, and secure external data exchange via Delta Sharing.
• Data Engineering Foundations: Proficiency in Delta Lake architecture, Delta Live Tables (DLT), PySpark, Spark SQL, and performance optimization via Liquid Clustering.
• Identity & Access Management: Strong experience with Entra ID (Azure AD), SCIM provisioning, and managed identities for secure resource access.
• Platform Operations: Mastery of Databricks Workflows and the use of system tables for monitoring audit logs, billing, and lineage.
• Ecosystem Integration: Experience integrating Databricks with external catalogs (such as Collibra or Alation), BI tools (Power BI/Tableau), and transformation frameworks like dbt.
• Compliance Frameworks: Ability to operationalize Microsoft Purview or similar tools to monitor PII/PHI and ensure adherence to GDPR, HIPAA, or CCPA.
Required Qualifications
• Experience: 8+ years in Data Governance/Management, including at least 3 years of hands-on Databricks experience.
• Unity Catalog Mastery: Proven track record of at least two large-scale Unity Catalog implementations.
• Cloud Proficiency: Deep experience in Azure Databricks or AWS Databricks environments.
• Certifications: Required: Databricks Certified Data Engineer Professional. Preferred: Databricks Certified Machine Learning Professional or DAMA CDMP.
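For context on the governance primitives this role centers on (the three-tier namespace, row-level filtering, and column masking), a minimal Databricks SQL sketch is shown below. This is an illustration only, not part of the job requirements; the catalog, schema, table, column, and group names (finance, hr, employees, ssn, hr_admins) are hypothetical, and running it requires a Unity Catalog-enabled workspace.

```sql
-- Three-tier namespace: catalog > schema > table (names are illustrative)
CREATE CATALOG IF NOT EXISTS finance;
CREATE SCHEMA IF NOT EXISTS finance.hr;
CREATE TABLE IF NOT EXISTS finance.hr.employees (
  id BIGINT,
  region STRING,
  ssn STRING
);

-- Row-level filtering: non-members of hr_admins see only US rows
CREATE OR REPLACE FUNCTION finance.hr.us_only(region STRING)
RETURN is_account_group_member('hr_admins') OR region = 'US';
ALTER TABLE finance.hr.employees
  SET ROW FILTER finance.hr.us_only ON (region);

-- Column masking: redact SSNs for everyone outside hr_admins
CREATE OR REPLACE FUNCTION finance.hr.mask_ssn(ssn STRING)
RETURN CASE WHEN is_account_group_member('hr_admins')
            THEN ssn ELSE '***-**-****' END;
ALTER TABLE finance.hr.employees
  ALTER COLUMN ssn SET MASK finance.hr.mask_ssn;
```

Candidates would be expected to design and operate patterns like this at enterprise scale, across many catalogs and domains.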