

Avance Consulting
Data Platform Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Platform Architect with a contract length of "unknown," offering a pay rate of "unknown." Key skills include the Microsoft Azure data stack, ETL with Microsoft Fabric, Power BI, and strong data governance. Requires 7+ years of relevant experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 31, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Redmond, WA
-
🧠 - Skills detailed
#Datasets #Storage #Azure SQL #Data Quality #XML (eXtensible Markup Language) #BI (Business Intelligence) #Data Engineering #ADF (Azure Data Factory) #Scala #Python #Data Modeling #Security #Data Lake #Spark (Apache Spark) #Dataflow #Metadata #Azure #PySpark #Data Governance #JSON (JavaScript Object Notation) #Microsoft Power BI #ADLS (Azure Data Lake Storage) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Visualization #AI (Artificial Intelligence) #Microsoft Azure #Azure Data Factory
Role description
Key Responsibilities
Design and implement ETL pipelines using Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Warehouse, SQL) and Azure Data Factory.
Build and maintain a metadata-driven Lakehouse architecture with threaded datasets to support multiple consumption patterns.
Develop agent-specific data lakes and an orchestration layer for an "uber" agent that can query across agents to answer customer questions.
Enable interactive data consumption via Power BI, Azure OpenAI, and other analytics and AI tools.
Ensure data quality, lineage, and governance across all ingestion and transformation processes.
Collaborate with product teams to understand data needs and deliver scalable solutions.
Optimize performance and cost across storage and compute layers.
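To illustrate the "metadata-driven" pattern the responsibilities above describe, here is a minimal Python sketch. All dataset names, the control-metadata shape, and the transform rule are hypothetical; in a real Fabric or ADF deployment the metadata would live in a control table and drive Dataflows or pipeline activities rather than plain functions.

```python
# Hypothetical control metadata: each entry describes one pipeline step
# (source dataset, target dataset, and a simple data-quality rule).
PIPELINE_METADATA = [
    {"source": "sales_raw", "target": "sales_clean", "drop_nulls": ["order_id"]},
    {"source": "customers_raw", "target": "customers_clean", "drop_nulls": ["customer_id"]},
]

def run_step(rows, step):
    """Apply one metadata-described transform: drop rows missing required keys."""
    required = step["drop_nulls"]
    return [r for r in rows if all(r.get(k) is not None for k in required)]

def run_pipeline(lake, metadata):
    """Execute every configured step, writing results to the target dataset.

    New sources are onboarded by adding metadata entries, not new code --
    the core idea behind a metadata-driven architecture.
    """
    for step in metadata:
        lake[step["target"]] = run_step(lake[step["source"]], step)
    return lake

# Toy in-memory "lake" standing in for Lakehouse tables.
lake = {
    "sales_raw": [{"order_id": 1, "amount": 10}, {"order_id": None, "amount": 5}],
    "customers_raw": [{"customer_id": "c1"}],
}
lake = run_pipeline(lake, PIPELINE_METADATA)
```

The design point is that the pipeline code stays generic while the metadata enumerates sources, targets, and rules, which is what makes components reusable across consumption patterns.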
Required Qualifications
7+ years of experience in data engineering with a focus on the Microsoft Azure data stack and Fabric technologies.
Strong expertise in:
Executive written and oral communication
Microsoft Fabric (Lakehouse, Dataflows Gen2, Pipelines, Notebooks)
Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen2
Power BI and/or other visualization tools
SQL, Python, and PySpark/Scala
Experience working with structured and semi-structured data (JSON, XML, CSV, Parquet).
Proven ability to build metadata-driven architectures and reusable components.
Familiarity with agent-based architectures and conversational AI integration is a plus.
Strong understanding of data modeling, data governance, and security best practices.
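As a concrete illustration of the semi-structured data handling the qualifications call for, the sketch below normalizes toy JSON, CSV, and XML payloads into one record shape using only the Python standard library. The sample payloads and field names are invented for illustration; production pipelines would do this with Spark readers or Fabric Dataflows.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# Hypothetical records arriving in three semi-structured formats; the goal
# is one normalized list of dicts regardless of the source format.
json_blob = '[{"id": 1, "name": "alpha"}]'
csv_blob = "id,name\n2,beta\n"
xml_blob = "<rows><row><id>3</id><name>gamma</name></row></rows>"

def parse_json(text):
    """JSON is already a list of dicts after loading."""
    return json.loads(text)

def parse_csv(text):
    """CSV values arrive as strings, so cast ids to int for a uniform schema."""
    return [{"id": int(r["id"]), "name": r["name"]}
            for r in csv.DictReader(io.StringIO(text))]

def parse_xml(text):
    """Flatten each <row> element into a dict with the same fields."""
    root = ET.fromstring(text)
    return [{"id": int(row.findtext("id")), "name": row.findtext("name")}
            for row in root]

records = parse_json(json_blob) + parse_csv(csv_blob) + parse_xml(xml_blob)
```

Each parser emits the same `{"id": int, "name": str}` shape, so downstream transforms never need to know which format a record came from.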






