

Bayforce
Cloud Data Warehouse Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Data Warehouse Architect on a 6+ month contract, fully remote, with an undisclosed pay rate. It requires 7+ years in data engineering and expertise in Azure, Databricks, and SAP integration.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Scala #Databricks #Security #Batch #Data Quality #Datasets #Microsoft Power BI #SQL Server #BI (Business Intelligence) #Computer Science #Delta Lake #Compliance #Data Engineering #Data Modeling #Data Architecture #PySpark #Cloud #Data Warehouse #SAP #Data Lake #MLflow #ADLS (Azure Data Lake Storage) #ML (Machine Learning) #Spark (Apache Spark) #SQL (Structured Query Language) #SAP Fiori #Data Governance #Data Catalog #Migration #Spark SQL #Visualization #SAP Analytics #Tableau #Clustering #AI (Artificial Intelligence) #Data Lineage #Azure #ETL (Extract, Transform, Load) #RDBMS (Relational Database Management System) #Snowflake #Synapse #Data Lakehouse #Metadata
Role description
• No 3rd party candidates or sponsorship!
Duration: 6 months + contract to hire
Location: Remote
We are seeking a Cloud Data Warehouse Architect to design and deliver a next-generation enterprise analytics platform. This highly technical role focuses on building a cloud-native, SAP-integrated, AI-ready architecture that supports analytics, reporting, and advanced machine learning at scale.
The architect will modernize the existing BI and data warehouse environment, currently anchored in IBM Netezza, Cognos, and Tableau, moving it to a scalable, cloud-based architecture.
This role requires deep technical expertise in data modeling, cloud-native design, and hybrid architectures that bridge legacy on-premises systems with cloud-first capabilities.
Key Responsibilities
Architectural Design & Modernization
• Lead the design of a cloud data warehouse and data lakehouse architecture capable of ingesting large-scale transactional and operational data
• Define integration strategies for core enterprise systems
• Develop a reference architecture leveraging Databricks and Delta Lake as core components
• Implement semantic modeling to unify reporting across Tableau, Power BI, and SAP Analytics Cloud (SAC)
Data Engineering & Performance
• Oversee ingestion pipelines for batch (Netezza extracts, flat files, nightly jobs) and near real-time (APIs, streaming) data sources
• Optimize query performance through partitioning, clustering, caching, and Delta Lake / warehouse design
• Establish reusable ETL/ELT patterns across Databricks notebooks, SQL-based orchestration, and integration with ActiveBatch scheduling
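The reusable ETL/ELT patterns described above typically center on idempotent, incremental upserts, so a batch can be safely rerun after a failure. Below is a minimal sketch of that pattern using Python's built-in sqlite3 as a stand-in for the Delta Lake `MERGE INTO` this role would actually use; the `orders` table and its columns are hypothetical:

```python
import sqlite3

# Hypothetical target table standing in for a Delta Lake table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        status     TEXT NOT NULL,
        updated_at TEXT NOT NULL
    )
""")

def upsert_batch(conn, rows):
    """Idempotent incremental load: insert new keys, update existing ones.
    This mirrors the semantics of Delta Lake's MERGE INTO."""
    conn.executemany(
        """
        INSERT INTO orders (order_id, status, updated_at)
        VALUES (?, ?, ?)
        ON CONFLICT(order_id) DO UPDATE SET
            status     = excluded.status,
            updated_at = excluded.updated_at
        """,
        rows,
    )
    conn.commit()

# A nightly batch, then a correction batch; rerunning either is safe.
upsert_batch(conn, [(1, "new", "2026-01-13"), (2, "new", "2026-01-13")])
upsert_batch(conn, [(2, "shipped", "2026-01-14"), (3, "new", "2026-01-14")])

print(conn.execute(
    "SELECT order_id, status FROM orders ORDER BY order_id").fetchall())
# [(1, 'new'), (2, 'shipped'), (3, 'new')]
```

Because the merge key (`order_id`) decides insert-vs-update, replaying the same extract never duplicates rows, which is the property that makes scheduled batch pipelines (e.g., under ActiveBatch) restartable.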
Governance, Security & Compliance
• Define and enforce data governance standards including naming conventions, metadata, lineage, and data quality
• Partner with Information Security on identity management (Azure AD), encryption, and RBAC/ABAC models
• Implement governance tooling such as Azure Purview, SAP metadata catalogs, Databricks Unity Catalog, and Glasswing
Collaboration & Enablement
• Partner with data engineering and visualization teams to deliver governed, high-performance datasets for Tableau, Power BI, SAC, and SAP Fiori
• Serve as the technical SME for architects, engineers, and analysts, ensuring alignment with cloud-native best practices
• Drive knowledge transfer from legacy platforms (Netezza, Cognos) into the modern analytics ecosystem
Qualifications
• Bachelor’s degree in Computer Science, Engineering, or a related field
• 7+ years of experience in data engineering, data warehouse architecture, or cloud data architecture
• Expertise in Azure services including ADLS, Synapse, Purview, Databricks, networking, and security
• Strong proficiency in Databricks (Delta Lake, PySpark, SQL) and/or Snowflake (warehouse design, scaling, security)
• Proven experience in data modeling (3NF, star schema, semantic layers)
• Deep SQL expertise across cloud and traditional RDBMS platforms (e.g., Netezza, SQL Server, Progress OpenEdge)
• Understanding of SAP S/4HANA integration and familiarity with SAP Datasphere
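The data-modeling experience called for above (3NF, star schema, semantic layers) comes down to separating measurable facts from conformed dimensions. A minimal star-schema sketch, again using sqlite3 purely for illustration, with hypothetical `dim_product` / `fact_sales` tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per entity.
cur.execute("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        name        TEXT,
        category    TEXT
    )
""")
# Fact table: numeric measures plus foreign keys into the dimensions.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    )
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"),
                 (2, "Gizmo", "Hardware"),
                 (3, "Manual", "Docs")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(10, 1, 2, 40.0), (11, 2, 1, 35.0),
                 (12, 1, 5, 100.0), (13, 3, 1, 15.0)])

# Typical star-schema query: join fact to dimension, aggregate by attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Docs', 15.0), ('Hardware', 175.0)]
```

The same fact-to-dimension join shape is what a semantic layer exposes to Tableau, Power BI, or SAC, so BI users aggregate by dimension attributes without writing joins themselves.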
Preferred
• Experience migrating from on-prem MPP platforms (such as Netezza) to cloud-native architectures
• Familiarity with Cognos to Tableau or Power BI migrations and dashboard optimization
• Hands-on experience with SAP Analytics Cloud (SAC) and embedded analytics
• Knowledge of machine learning workflows and integration with Databricks MLflow or Azure ML
• Strong knowledge of data governance frameworks and tooling (Purview, Unity Catalog, SAC governance)






