Radinnova

Data Warehouse Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Warehouse Architect on a contract basis, offering competitive pay. The position requires 5–10 years of experience in data warehousing, expertise in SQL, and proficiency with Microsoft Fabric or AWS. Familiarity with SAP and data governance frameworks is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 1, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Detroit Metropolitan Area
-
🧠 - Skills detailed
#ADF (Azure Data Factory) #S3 (Amazon Simple Storage Service) #Kafka (Apache Kafka) #Athena #Observability #AWS (Amazon Web Services) #Data Accuracy #Data Catalog #Azure Data Factory #Redshift #BI (Business Intelligence) #Data Warehouse #Leadership #Metadata #Data Engineering #Automated Testing #Semantic Models #Cloud #dbt (data build tool) #Compliance #Vault #Batch #Data Architecture #AWS Glue #Data Extraction #Data Governance #Data Quality #Tableau #Data Modeling #Data Pipeline #ETL (Extract, Transform, Load) #Microsoft Power BI #Physical Data Model #Security #Apache Airflow #Data Vault #MDM (Master Data Management) #Alation #Data Lineage #Collibra #Airflow #Scala #SQL (Structured Query Language) #Monitoring #Data Management #SAP #Azure
Role description
Position Overview
We are seeking an experienced Data Warehouse Architect to design, build, and evolve our enterprise data platform. This role is central to unifying data across our organization's critical business domains — including Order to Cash (O2C), Procure to Pay (P2P), and Finance Record to Report (R2R) — into a scalable, governed, and performant data warehouse. The ideal candidate brings deep expertise in enterprise data modeling, modern cloud data platforms, and hands-on experience integrating data from both large-scale ERP systems and niche industry platforms.

Key Responsibilities

Data Architecture & Modeling
• Design and own enterprise-grade dimensional and relational data models across core business domains: Order to Cash (O2C), Procure to Pay (P2P), and Finance Record to Report (R2R)
• Develop and maintain conceptual, logical, and physical data models aligned to business process frameworks (e.g., APQC, SAP reference models)
• Establish and enforce data modeling standards, naming conventions, and architectural best practices across the organization
• Define and manage data domains, subject areas, and data products to support self-service analytics and enterprise reporting

Data Pipeline & Integration
• Architect and oversee end-to-end data pipelines — ingestion, transformation, orchestration, and delivery — for batch and near-real-time workloads
• Lead the integration of structured and semi-structured data from ERP and operational systems into the data warehouse
• Evaluate, select, and implement ETL/ELT tools and frameworks appropriate to the platform stack
• Ensure pipeline reliability, observability, and scalability through monitoring, alerting, and automated testing practices

Cloud Platform Engineering
• Architect and manage cloud-based data warehouse environments on Microsoft Fabric and/or AWS (Redshift, Glue, S3, Lake Formation, or equivalent)
• Design lakehouse and medallion architectures (Bronze / Silver / Gold) where appropriate
• Optimize platform performance, cost, and security in alignment with enterprise cloud governance standards

ERP & Source System Integration
• Lead data extraction and mapping from enterprise ERP platforms such as SAP, including complex financial, procurement, and sales modules
• Integrate data from niche and industry-specific ERP platforms, including Clipboard K8 and Stone Profit System
• Collaborate with ERP functional teams to understand business processes, data flows, and source-system data quality characteristics
• Document source-to-target mappings and maintain data lineage across all integrated systems

Governance & Quality
• Define and implement data governance frameworks, including data ownership, stewardship, and quality rules across integrated domains
• Partner with data governance and compliance teams to ensure data accuracy, completeness, and auditability — especially for financial reporting (R2R)
• Implement and maintain master data management (MDM) principles for key entities such as customers, vendors, and the chart of accounts

Leadership & Collaboration
• Serve as the technical lead and subject matter expert for all data warehouse and data modeling initiatives
• Mentor and guide data engineers and analysts on architecture standards and best practices
• Partner with business stakeholders across Finance, Supply Chain, and Operations to translate requirements into scalable data solutions
• Engage with vendors, consultants, and third-party data providers as needed

Required Qualifications
• 5–10 years of progressive experience in data warehousing, data architecture, or data engineering roles
• Proven expertise designing enterprise data models for Order to Cash, Procure to Pay, and/or Finance Record to Report business domains
• Strong proficiency in SQL and experience with dimensional modeling (Kimball, Inmon) and modern Data Vault methodologies
• Hands-on experience building and managing data pipelines using tools such as dbt, Azure Data Factory, AWS Glue, Apache Airflow, or similar
• Demonstrated experience with cloud data platforms — Microsoft Fabric and/or AWS (Redshift, S3, Glue, Athena, etc.)
• Meaningful experience integrating data from SAP, including financial, logistics, and procurement modules
• Solid understanding of data governance, data quality frameworks, and metadata management

Preferred Qualifications
• Experience with niche ERP platforms, including Clipboard K8 and/or Stone Profit System
• Familiarity with Microsoft Fabric components: OneLake, Lakehouse, Data Pipelines, and Semantic Models
• AWS certifications (e.g., AWS Certified Data Analytics – Specialty) or Microsoft certifications (e.g., DP-700 Fabric Analytics Engineer)
• Experience with BI and reporting platforms such as Power BI, Tableau, or Phocas
• Background in financial close processes, revenue recognition, or supply chain operations
• Familiarity with data catalog and lineage tools (e.g., Purview, Alation, Collibra)
• Experience with real-time or streaming data architectures (Kafka, Kinesis, Event Hubs)
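To give a concrete sense of the dimensional modeling this role calls for, here is a minimal star-schema sketch for the Order to Cash (O2C) domain, using Python's built-in sqlite3 for portability. All table names, column names, and data are illustrative assumptions, not the employer's actual schema.

```python
import sqlite3

# Minimal O2C star schema: two dimension tables plus one fact table.
# Every name and value below is a made-up example for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Dimensions carry descriptive attributes used for slicing and grouping
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- e.g. 20260401 (YYYYMMDD)
    fiscal_year   INTEGER,
    fiscal_period TEXT
);
-- Fact table: one row per invoice line, foreign keys into the dimensions
CREATE TABLE fact_invoice_line (
    invoice_id   TEXT,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    quantity     INTEGER,
    net_amount   REAL
);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme Corp", "Midwest"), (2, "Globex", "East")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20260401, 2026, "P04")])
cur.executemany("INSERT INTO fact_invoice_line VALUES (?, ?, ?, ?, ?)",
                [("INV-1", 1, 20260401, 10, 500.0),
                 ("INV-2", 2, 20260401, 4, 200.0),
                 ("INV-3", 1, 20260401, 2, 100.0)])

# A typical O2C rollup: net revenue by region for one fiscal period
cur.execute("""
    SELECT c.region, SUM(f.net_amount) AS revenue
    FROM fact_invoice_line f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d     ON d.date_key     = f.date_key
    WHERE d.fiscal_period = 'P04'
    GROUP BY c.region
    ORDER BY c.region
""")
rows = cur.fetchall()
print(rows)  # [('East', 200.0), ('Midwest', 600.0)]
```

The same fact/dimension split scales to the P2P and R2R domains mentioned above; in practice the dimensions would also carry surrogate keys and slowly changing dimension history per the Kimball methodology the posting references.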