MDM Lead Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an MDM Lead Engineer; the contract length and pay rate are unspecified. Key skills include expertise in MDM tools, Azure Cloud Services, ETL/ELT processing, and automation scripting. Experience with Databricks and security protocols is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date discovered
September 3, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Unknown
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Dataiku #Ansible #Kubernetes #Airflow #Docker #CHEF #"ETL (Extract, Transform, Load)" #Bash #Apache Airflow #MDM (Master Data Management) #Alation #Monitoring #Azure cloud #Data Orchestration #Azure #Data Management #Logging #Data Catalog #Scripting #Data Aggregation #Security #ADF (Azure Data Factory) #DevOps #Puppet #Linux #Snowflake #Data Quality #Databricks #Azure Data Factory #Vault #Cloud #Python
Role description
Position Summary

We are looking for a lead to manage the offshore team and ensure the reliability of the software applications that support our data needs. The candidate should have an excellent track record of supporting Azure cloud-based services and applications, an understanding of the present IT landscape and how current trends can affect the business, and a strong understanding of automating systems engineering tasks and systems integration.

Required:
- Expert in at least one MDM tool.
- Knowledge of Reference Data, Master Data Management, Snowflake, Python, and ETL/ELT processing preferred.
- Able to help customers with MDM tool selection.
- Strong understanding of Windows systems and Azure Cloud Services, especially Azure Data Factory (ADF) and Azure Key Vault (AKV).
- Strong understanding of DevOps and the complete SDLC of an application.
- Experienced in designing and implementing infrastructure and onboarding new applications.
- Experienced with Databricks workspaces, including workload optimization, troubleshooting, and support.
- Experienced with logging and monitoring tools and incident management.
- Proven history of automating tasks through scripting (Bash, Python, PowerShell).
- Experience with configuration management tools (e.g., Puppet, Chef, Ansible).
- Experience working with security protocols (e.g., OAuth, SCIM, key vaults, TLS, IDM).
- Familiarity with various software development processes and methodologies, especially as they relate to ETL.
- Familiarity with container technologies (e.g., Docker, Kubernetes).
- Experience with Red Hat Linux.

Nice to have:
- Desire to continuously discover, experiment with, and evaluate new technologies.
- Experience with data aggregation tools (Cube Cloud), data cataloging (Alation), self-service analytics tools (Dataiku), data lineage tools (Solidatus), or data quality tools (Anomalo).
- Experience with data orchestration platforms such as Apache Airflow.