MNR Consulting Services

Senior Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect (Remote) with a contract length of "unknown" and a pay rate of "$XX/hour." Requires a Bachelor's/Master's in a related field, 10+ years in data management, and expertise in Data Mesh and Data Fabric principles.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 12, 2025
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Analysis #Data Modeling #Data Warehouse #Data Lineage #Kafka (Apache Kafka) #Scala #Data Profiling #Documentation #Data Governance #Data Pipeline #Snowflake #Data Science #Physical Data Model #Metadata #Data Quality #ERWin #Data Privacy #Python #Databricks #Azure Data Factory #Compliance #Data Integrity #SQL Queries #ETL (Extract, Transform, Load) #Data Ingestion #SQL (Structured Query Language) #Alation #Data Catalog #Data Architecture #Cloud #AWS Glue #Security #Azure #Collibra #Data Management #Scripting #ADF (Azure Data Factory) #Complex Queries #AWS (Amazon Web Services) #Computer Science #GDPR (General Data Protection Regulation) #Data Engineering
Role description
Data Architect (Remote)

Job Summary:
We are seeking an experienced Data Architect to design and build the unified data model supporting our Operational Data Store (ODS) initiative. This is a working architect role requiring both strategic data modeling expertise and hands-on technical execution. The ideal candidate will bridge domain-level data ownership (Data Mesh principles) with enterprise-wide integration (Data Fabric patterns), creating a unified data layer that respects source system context while enabling cross-domain analytics and operations. This role requires someone who can define the architectural vision while personally building data models, writing transformation logic, and validating implementations against source systems. This is a remote, work-from-home opportunity in the US.

Key Responsibilities:
• Design and build the unified data model for the Operational Data Store, balancing domain-specific context with enterprise integration requirements.
• Define the architectural approach for federating domain data products into a cohesive enterprise data layer without creating monolithic dependencies.
• Develop data transformation specifications, mapping rules, and working examples that preserve domain semantics while enabling cross-domain consistency.
• Establish canonical data models that integrate across domain boundaries while respecting source system ownership and business context.
• Perform source system analysis, data profiling, and gap assessments to understand domain data products and their integration requirements.
• Write and validate SQL queries, transformation logic, and data quality rules to prove out architectural decisions before handoff (see the sketch after this list).
• Define system of record ownership across domains and maintain accurate data lineage documentation for federated data sources.
• Design integration patterns that allow domains to evolve independently while maintaining enterprise data consistency.
• Collaborate directly with Data Engineers during implementation, troubleshooting data quality issues and refining transformation logic.
• Partner with domain stakeholders and enterprise architects to align domain data products with cross-domain analytics and operational needs.
• Establish federated data governance standards that balance domain autonomy with enterprise consistency requirements.
• Conduct architecture reviews focusing on data integrity, performance optimization, and scalability of the ODS.
• Ensure compliance with data privacy regulations, security standards, and audit requirements in financial services.
• Stay current with industry trends in Data Mesh, Data Fabric, and regulatory changes in the FinTech sector.
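As a rough illustration of the profiling and validation work described above, here is a minimal Python sketch. The table and column names (src_customer, customer_id, email) are hypothetical placeholders, not part of any actual ODS model, and SQLite stands in for whatever source systems are actually involved.

import sqlite3

# Stand-in for a domain source table; in practice this would be a query run
# against the real source system or domain data product.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_customer (customer_id INTEGER, email TEXT, source_domain TEXT);
INSERT INTO src_customer VALUES
  (1, 'a@example.com', 'claims'),
  (2, NULL,            'claims'),
  (2, 'b@example.com', 'billing');
""")

# Basic profiling metrics: row count, null rate on a key attribute, and
# duplicate business keys that would violate a uniqueness rule in the ODS.
profile = conn.execute("""
SELECT COUNT(*)                                 AS row_count,
       SUM(email IS NULL) * 1.0 / COUNT(*)      AS email_null_rate,
       COUNT(*) - COUNT(DISTINCT customer_id)   AS duplicate_customer_ids
FROM src_customer
""").fetchone()

print(dict(zip(["row_count", "email_null_rate", "duplicate_customer_ids"], profile)))

In practice the same kinds of checks would be written against the actual source platforms before transformation logic is handed to Data Engineers.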
Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
• 10+ years of experience in data management, with at least 5 years in data architecture roles.
• Strong understanding of Data Mesh principles (domain ownership, data as a product, federated governance) and Data Fabric concepts (unified access, integration layer, cross-domain visibility).
• Proven expertise in logical and physical data modeling using tools such as ERwin, PowerDesigner, or similar.
• Experience designing canonical data models that integrate multiple domain data sources while preserving business context.
• Strong hands-on SQL skills, with the ability to write complex queries for data profiling, validation, and transformation.
• Experience building Operational Data Stores, data warehouses, or enterprise data hubs with multiple source system integrations.
• Demonstrated ability to perform data profiling, source system analysis, and data quality assessments independently.
• Hands-on experience developing data transformation specifications and mapping documentation that engineers can implement (a brief illustrative example appears at the end of this posting).
• Working knowledge of ETL/ELT patterns and data pipeline architecture.
• Familiarity with event streaming platforms (Kafka, Kinesis) for real-time data ingestion scenarios.
• Solid understanding of data governance, metadata management, and data lineage practices.

Preferred Qualifications:
• Demonstrated judgment and flexibility; deals positively with shifting priorities and rapidly changing environments.
• Experience in financial services, insurance, or large-scale enterprise data platforms.
• Experience implementing Data Mesh architectures or federated data governance models.
• Proficiency with Python or a similar scripting language for data analysis and validation.
• Familiarity with cloud data services (AWS Glue, Azure Data Factory, Snowflake, Databricks).
• Knowledge of data catalog and lineage tools (Collibra, Alation, Apache Atlas).
• Exposure to domain-driven design principles applied to data architecture.
• Understanding of regulatory requirements (SOX, GDPR, CCPA) as they apply to data management.
• Experience mentoring Data Engineers and establishing team technical standards.
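For context on the "transformation specifications and mapping documentation" mentioned in the qualifications, the sketch below shows one lightweight way such a spec might be captured in Python. The domains, field names, and transformation rules are hypothetical and purely illustrative, not part of the actual role.

from dataclasses import dataclass

@dataclass(frozen=True)
class FieldMapping:
    source_system: str   # owning domain / system of record
    source_field: str    # field as exposed by the domain data product
    target_field: str    # canonical field in the unified ODS model
    transform: str       # transformation rule, expressed here as SQL text

# Example mappings for a hypothetical canonical customer entity.
CUSTOMER_MAPPINGS = [
    FieldMapping("claims_domain",  "clm_cust_no", "customer_id", "CAST(clm_cust_no AS BIGINT)"),
    FieldMapping("billing_domain", "acct_email",  "email",       "LOWER(TRIM(acct_email))"),
]

for m in CUSTOMER_MAPPINGS:
    print(f"{m.source_system}.{m.source_field} -> ods.customer.{m.target_field}: {m.transform}")

A spec in roughly this shape is something Data Engineers can implement directly and architects can validate against source systems.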