

TechDoQuest
Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect with a contract length of more than 6 months, offering a competitive pay rate. Key skills required include data modeling, data ingestion frameworks, and data governance. Experience in healthcare or financial services is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 8, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#BI (Business Intelligence) #AWS (Amazon Web Services) #Data Ingestion #Classification #Data Architecture #Data Lineage #ETL (Extract, Transform, Load) #Data Integration #MDM (Master Data Management) #Data Lake #Compliance #Cloud #Monitoring #Virtualization #GCP (Google Cloud Platform) #Data Framework #NoSQL #GDPR (General Data Protection Regulation) #Data Management #Leadership #Computer Science #SQL (Structured Query Language) #Data Privacy #Data Pipeline #Azure #Data Quality #API (Application Programming Interface) #Data Modeling #Data Catalog #Data Lifecycle #Batch #Data Governance #Data Science #Data Engineering #Metadata
Role description
Role Purpose
Define enterprise data architecture standards, create data exchange and ingestion frameworks, establish data quality and governance patterns, and develop the data domain models and templates that ensure consistent, high-quality data across the organization. This role creates enterprise data patterns and frameworks, not operational data solutions.
What Makes This Role Unique
• Data framework architect: Design the data ingestion framework with schema validation, quality checks, and exception handling used enterprise-wide
• Data quality champion: Embed quality into frameworks at ingestion, not as an afterthought
• Exchange pattern owner: Create the patterns for internal and third-party data exchange
• Strategic partnership: Collaborate closely with Data Platform Enablement team (they operate the platform, you create the patterns)
Key Responsibilities
Enterprise Data Standards & Patterns (40%)
• Help define enterprise data modeling standards (conceptual, logical, physical, dimensional models)
• Create data domain model standards with quality framework integration
• Define data architecture patterns for common scenarios (OLTP, OLAP, streaming, hybrid)
• Document data persistence patterns (SQL vs NoSQL selection, data lake vs warehouse, caching)
• Create master data management patterns and reference data standards
• Establish data lifecycle management standards (retention, archival, purging)
Data Exchange & Ingestion Frameworks (35%)
Design enterprise data ingestion framework:
• Central pipeline architecture for all data ingestion (batch, streaming, real-time)
• Schema validation framework (enforce schemas at ingestion, schema registry)
• Data quality validation framework integrated into pipeline (completeness, accuracy, consistency, timeliness)
• Exception handling framework (quarantine, alerts, remediation workflows)
• Data lineage tracking through ingestion pipeline
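To make the ingestion framework concrete, here is a minimal sketch of schema validation, quality checks, and quarantine routing. All names, fields, and rules are invented for illustration; a production framework would typically back this with a schema registry and a configurable rules engine.

```python
# Hypothetical sketch: schema + quality validation at ingestion, with
# failing records routed to a quarantine rather than dropped silently.
REQUIRED_FIELDS = {"id", "source", "amount"}  # illustrative schema

def validate_schema(record: dict) -> list[str]:
    """Return schema violations: missing required fields, wrong types."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount must be numeric")
    return errors

def check_quality(record: dict) -> list[str]:
    """Quality-dimension checks (accuracy shown; completeness is covered
    by the schema check above)."""
    errors = []
    if record.get("amount") is not None and record["amount"] < 0:
        errors.append("accuracy: amount must be non-negative")
    return errors

def ingest(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Route each record to the accepted set or to quarantine with reasons."""
    accepted, quarantined = [], []
    for record in records:
        errors = validate_schema(record) + check_quality(record)
        if errors:
            quarantined.append({"record": record, "errors": errors})
        else:
            accepted.append(record)
    return accepted, quarantined

accepted, quarantined = ingest([
    {"id": 1, "source": "crm", "amount": 10.0},
    {"id": 2, "source": "crm"},  # missing amount -> quarantined
])
```

The key design point the posting emphasizes is that quality is enforced at the ingestion boundary, so downstream consumers never see unvalidated records.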
Create data exchange patterns for:
• Internal system-to-system data sharing
• Third-party data exchange (inbound/outbound): SFTP/API patterns, data format standards, partner onboarding templates, data contract templates
• Real-time vs batch exchange criteria
Additional responsibilities:
• Define data integration patterns (ETL, ELT, streaming, CDC, data virtualization)
• Establish data API standards (data services, data APIs vs transactional APIs)
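A data contract template, as referenced in the third-party exchange bullet above, might be sketched as a simple typed structure. The field names here are hypothetical, not a standard; real contracts often follow community specifications such as the Open Data Contract Standard.

```python
from dataclasses import dataclass, field

# Hypothetical contract template: all field names are illustrative.
@dataclass
class DataContract:
    dataset: str
    owner: str
    exchange_mode: str                 # "batch" | "streaming" | "real-time"
    fields: dict                       # field name -> declared type
    quality_slas: dict = field(default_factory=dict)
    classification: str = "internal"   # public/internal/confidential/restricted

# Example contract for a partner-facing dataset.
orders_contract = DataContract(
    dataset="orders",
    owner="sales-data-steward",
    exchange_mode="batch",
    fields={"order_id": "string", "total": "decimal"},
    quality_slas={"completeness": ">= 99%", "timeliness": "daily by 06:00"},
)
```

A template like this gives partner onboarding a fixed shape: every exchange declares its schema, quality SLAs, and classification up front instead of negotiating them ad hoc.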
Data Quality & Governance Frameworks (15%)
• Create enterprise data quality framework (quality dimensions, metrics, rules by domain, monitoring, remediation)
• Define data governance patterns (stewardship model, cataloging standards, metadata management, lineage and impact analysis)
• Establish data privacy and protection patterns (PII handling, encryption, masking, tokenization)
• Document data classification framework (public, internal, confidential, restricted)
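As one illustration of the protection patterns listed above, here is a small masking sketch. The masking rule and field names are invented examples; real frameworks would drive this from the data catalog's classification metadata and support tokenization as well.

```python
# Illustrative PII masking pattern; tiers mirror the classification
# framework above. Rules here are invented for the sketch.
CLASSIFICATION_TIERS = ("public", "internal", "confidential", "restricted")

def mask_email(value: str) -> str:
    """Keep the domain, mask the local part (a common masking pattern)."""
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain if domain else "***"

def apply_protection(record: dict, pii_fields: set) -> dict:
    """Return a copy with PII fields masked for non-privileged consumers."""
    return {k: (mask_email(v) if k in pii_fields else v)
            for k, v in record.items()}

safe = apply_protection(
    {"name": "public doc", "email": "jane.doe@example.com"},
    pii_fields={"email"},
)
# safe["email"] -> "j***@example.com"
```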
Roadmap & Enablement (10%)
• Develop enterprise data architecture modernization roadmap
• Train solution architects on data patterns and frameworks
• Review solution architectures for data pattern compliance
• Coordinate with Data Platform Enablement team on platform capabilities vs patterns
• Participate in data governance forums
Required Qualifications
Education:
• Bachelor's degree in Computer Science, Information Systems, Data Science, or related field
Experience:
• 12+ years in data architecture, data engineering, or enterprise architecture
• 5+ years creating enterprise data standards, frameworks, and patterns
• Proven experience designing data ingestion frameworks with quality and governance
• Experience with data exchange patterns for internal and external parties
• Track record establishing canonical data models and data governance frameworks
Certifications (Preferred):
• TOGAF certification
• Cloud data platform certification (AWS Data Analytics, Azure Data Engineer, Google Cloud Data Engineer)
• DAMA CDMP (Certified Data Management Professional)
• Data governance certification
Required Technical Skills

| Skill Category | Required Skills | Proficiency Level |
| --- | --- | --- |
| Data Standards | Data modeling standards (conceptual, logical, physical) | Expert |
| Data Standards | Canonical data model design | Expert |
| Data Standards | Data domain modeling | Expert |
| Data Ingestion Framework | Central pipeline architecture | Expert |
| Data Ingestion Framework | Schema validation framework | Expert |
| Data Ingestion Framework | Data quality framework integration | Expert |
| Data Ingestion Framework | Exception handling framework | Advanced |
| Data Exchange Patterns | Internal data exchange patterns | Expert |
| Data Exchange Patterns | Third-party data exchange patterns | Expert |
| Data Exchange Patterns | Data contract templates | Advanced |
| Data Quality Framework | Quality dimensions and metrics | Expert |
| Data Quality Framework | Quality validation rules | Advanced |
| Data Governance | Data governance frameworks | Advanced |
| Data Governance | Data cataloging standards | Advanced |
| Data Governance | Data lineage patterns | Advanced |
| Platform Knowledge | Cloud data platforms (AWS, Azure, GCP) | Advanced |
| Platform Knowledge | Data lake and warehouse patterns | Expert |
| Platform Knowledge | Data pipeline patterns | Advanced |
Preferred Qualifications
• Experience in healthcare or financial services with complex data exchange requirements
• Track record implementing data quality frameworks at enterprise scale
• Experience with third-party data exchange and partner onboarding
• Published thought leadership on data architecture or data quality
• Experience with data privacy regulations (HIPAA, GDPR, CCPA)
Success Metrics (First 12 Months)
• Data quality validation pass rate >95% through ingestion framework
• Onboard 10+ data sources to central ingestion framework
• Establish data exchange patterns adopted by 80%+ of new integrations
• Canonical model coverage for 15+ core enterprise entities
• Data governance framework operational with 90%+ cataloging compliance
What You'll Deliver
• Data ingestion framework with schema validation, quality checks, exception handling
• Data exchange patterns for internal and third-party data
• Data quality framework integrated with domain models
• Data governance patterns and templates
• Data integration pattern library
• Data architecture reference architectures
• Data API standards
Working Relationships
Key Partnerships:
• Data Platform Enablement team (they operate the platform, you create the patterns)
• Solution architects (apply data patterns)
• Data stewards and data governance council
• Integration Architect (data exchange patterns)
• InfoSec team (data privacy and protection patterns)
Governance Participation:
• Architecture Review Board (bi-weekly)
• Data governance forums
• Pattern Review Sessions (bi-weekly)
Company Culture & Values
Our Enterprise Architecture team operates on principles of collaboration, excellence, and innovation:
• Pattern-first mindset: We create reusable blueprints that enable consistency and quality
• Partnership model: We work alongside operational teams (App Dev, Data Platform, Infrastructure, InfoSec) as strategic partners
• Continuous improvement: Patterns evolve based on feedback from implementation
• Enablement focus: Success means solution architects effectively apply our patterns
• Executive backing: Strong leadership support for architecture governance and standards






