Enterprise Engineering Inc. (EEI)

Sr. Data Engineer with MDM Experience

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer with MDM experience, based in Iselin, NJ (Hybrid). Contract length and pay rate are unspecified. It requires 12+ years in data engineering, recent banking-domain experience, and expertise in Python, Kafka, and MDM solutions.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 6, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Iselin, NJ
🧠 - Skills detailed
#Data Integration #Deployment #Metadata #Azure Data Factory #Automation #Python #Microservices #PySpark #ADF (Azure Data Factory) #GraphQL #Security #SQL (Structured Query Language) #Azure #Airflow #Cloud #Databricks #Snowflake #Data Privacy #Scala #Compliance #Data Governance #Data Processing #Kafka (Apache Kafka) #MDM (Master Data Management) #Azure cloud #Spark (Apache Spark) #Data Quality #Data Management #Data Engineering #Talend #Informatica
Role description
Role: Sr. Data Engineer (MDM experience is a must)
Location: Iselin, NJ (Hybrid)
• Local candidates only.
• An in-person interview will be required.
• Recent 3-5 years of banking domain experience required.
• No vendor resumes, please.

Job Description:

Job Summary:
• We are seeking a highly skilled and experienced Senior Data Engineer specializing in Master Data Management (MDM) to join our data team.
• The ideal candidate will have a strong background in designing, implementing, and managing end-to-end MDM solutions, preferably within the financial sector.
• You will be responsible for architecting robust data platforms, evaluating MDM tools, and aligning data strategies to meet business needs.

Key Responsibilities:
• Lead the design, development, and deployment of comprehensive MDM solutions across the organization, with an emphasis on financial data domains.
• Demonstrate extensive experience with multiple MDM implementations, including platform selection, comparison, and optimization.
• Architect and present end-to-end MDM architectures, ensuring scalability, data quality, and governance standards are met.
• Evaluate various MDM platforms (e.g., Informatica, Reltio, Talend, IBM MDM) and provide objective recommendations aligned with business requirements.
• Collaborate with business stakeholders to understand reference data sources and develop strategies for managing reference and master data effectively.
• Implement data integration pipelines leveraging modern data engineering tools and practices.
• Develop, automate, and maintain data workflows using Python, Airflow, or Astronomer (see the Airflow sketch below).
• Build and optimize data processing solutions using Kafka, Databricks, Snowflake, Azure Data Factory (ADF), and related technologies (see the streaming sketch below).
• Design microservices, especially utilizing GraphQL, to enable flexible and scalable data services (see the GraphQL sketch below).
• Ensure compliance with data governance, data privacy, and security standards.
• Support CI/CD pipelines for continuous integration and deployment of data solutions.

Required Qualifications:
• 12+ years of experience in data engineering, with a proven track record of MDM implementations, preferably in the financial services industry.
• Extensive hands-on experience designing and deploying MDM solutions and comparing MDM platform options.
• Strong functional knowledge of reference data sources and domain-specific data standards.
• Expertise in Python, PySpark, Kafka, microservices architecture (particularly GraphQL), Databricks, Snowflake, Azure Data Factory, SQL, and orchestration tools such as Airflow or Astronomer.
• Familiarity with CI/CD practices, tools, and automation pipelines.
• Ability to work collaboratively across teams to deliver complex data solutions.
• Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).

Preferred Qualifications:
• Familiarity with financial data models and regulatory requirements.
• Experience with Azure cloud platforms.
• Knowledge of data governance, data quality frameworks, and metadata management.
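
For illustration only (not part of the posting): a minimal sketch of the Python/Airflow workflow automation named in the responsibilities. The DAG id, task name, and the load_reference_data callable are hypothetical placeholders, and the sketch assumes Airflow 2.4 or later.

```python
# Illustrative sketch -- DAG id, task name, and the callable are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_reference_data():
    # Placeholder for an MDM reference-data load step (e.g., into Snowflake).
    print("loading curated reference data")


with DAG(
    dag_id="mdm_reference_data_refresh",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                    # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_reference_data",
        python_callable=load_reference_data,
    )
```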
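
Also for illustration only: a rough PySpark Structured Streaming sketch of the kind of Kafka-to-Databricks processing the responsibilities list. The broker address, topic name, and paths are hypothetical, and the sketch assumes the Spark-Kafka connector and Delta Lake are available on the cluster (as they are on Databricks).

```python
# Illustrative sketch -- broker, topic, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("mdm-ingest-sketch").getOrCreate()

# Read a raw master-data topic from Kafka as a streaming DataFrame.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "customer-master-updates")
    .load()
)

# Kafka delivers the payload as bytes; cast it to string for downstream parsing.
events = raw.select(col("value").cast("string").alias("payload"))

# Land the stream in Delta for later match/merge and survivorship processing.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/customer_master")
    .start("/tmp/delta/customer_master_raw")
)
query.awaitTermination()
```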
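
Finally, a small sketch of a GraphQL-style data service of the sort the microservices bullet describes, using the graphene library as one possible choice; the Query type, field names, and in-memory lookup are hypothetical.

```python
# Illustrative sketch -- the schema and the in-memory lookup are hypothetical.
import graphene

# Stand-in for a master-data (golden record) store keyed by party id.
_GOLDEN_RECORDS = {"42": "Acme Capital LLC"}


class Query(graphene.ObjectType):
    party_name = graphene.String(party_id=graphene.ID(required=True))

    def resolve_party_name(root, info, party_id):
        # Resolve a golden-record attribute for the requested party.
        return _GOLDEN_RECORDS.get(party_id)


schema = graphene.Schema(query=Query)

# Graphene camel-cases field and argument names by default.
result = schema.execute('{ partyName(partyId: "42") }')
print(result.data)  # {'partyName': 'Acme Capital LLC'}
```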