

Xpedient Technologies, LLC
Senior Databricks Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Databricks Engineer with a contract length of "unknown" and a pay rate of "unknown." Key skills include PySpark, Informatica MDM, and data governance. Remote work location; experience with CJIS compliance is required.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Remote
-
🧠 - Skills detailed
#Scala #MDM (Master Data Management) #Data Governance #Data Quality #Compliance #ETL (Extract, Transform, Load) #PySpark #Data Management #Databricks #Databases #Cloud #Leadership #Data Lineage #Data Pipeline #Data Integration #Spark (Apache Spark) #Informatica PowerCenter #Informatica #Delta Lake
Role description
Key Responsibilities
Databricks Lakehouse Engineering: Design and build complex, scalable data pipelines using PySpark and Delta Lake. Implement the Medallion Architecture (Bronze, Silver, Gold) to process TDCJ’s high-volume transactional data.
Master Data Management (MDM): Configure and maintain the Informatica MDM Hub (or IDMC). Design match/merge rules, survivorship logic, and trust frameworks to resolve identities across legacy mainframe systems and modern cloud databases.
Data Governance & Unity Catalog: Use Databricks Unity Catalog to manage fine-grained access control, ensuring CJIS (Criminal Justice Information Services) compliance and strict data lineage.
ETL/ELT Modernization: Migrate legacy Informatica PowerCenter workflows into Databricks Workflows or Informatica CDI (Cloud Data Integration) to improve processing speed and cost-efficiency.
Quality & Compliance: Implement automated data quality (DQ) checks using Informatica CDQ to flag anomalies in offender tracking and court-mandated reporting.
Technical Leadership: Act as the primary technical point of contact for the Data Management Office (DMO), mentoring junior engineers and collaborating with the Chief Data Officer (CDO).
Work Location: Remote
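The MDM responsibility above centers on match/merge rules with survivorship logic and a trust framework. As a rough illustration of what survivorship means in practice, here is a minimal plain-Python sketch: matched records from several source systems are merged into one golden record, attribute by attribute, preferring the most trusted source and breaking ties by recency. The source names and trust scores are hypothetical examples, not Informatica configuration or anything specified in this posting.

```python
# Hedged sketch of MDM survivorship logic (illustrative only; not the
# Informatica MDM Hub API). Source names and trust ranks are made up.
from datetime import date

# Higher rank = more trusted source system (hypothetical values).
TRUST = {"mainframe": 1, "intake_app": 2, "cloud_crm": 3}

def survivorship(records, fields):
    """Merge matched records into one golden record, field by field."""
    golden = {}
    for field in fields:
        # Only consider records that actually populate this field.
        candidates = [r for r in records if r.get(field) not in (None, "")]
        if not candidates:
            golden[field] = None
            continue
        # Prefer the most trusted source; break ties by latest update date.
        best = max(candidates,
                   key=lambda r: (TRUST.get(r["source"], 0), r["updated"]))
        golden[field] = best[field]
    return golden

records = [
    {"source": "mainframe", "updated": date(2020, 1, 5),
     "name": "J. DOE", "dob": "1980-02-01"},
    {"source": "cloud_crm", "updated": date(2024, 6, 1),
     "name": "John Doe", "dob": None},
]
merged = survivorship(records, ["name", "dob"])
# "name" survives from the higher-trust cloud_crm record; "dob" survives
# from the mainframe record because it is the only populated value.
```

In a real engagement this logic would be expressed as match/merge and trust rules inside the MDM Hub (or IDMC) rather than hand-written code; the sketch only shows the field-level decision the trust framework encodes.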





