

Nexwave
Azure Data Platform Architect with ETL
Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Platform Architect with ETL in Redmond, WA, onsite from day one. It requires 7+ years in data engineering, expertise in Microsoft Azure Data stack, and strong skills in SQL, Python, and Power BI.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: December 24, 2025
Duration: Unknown
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Redmond, WA
Skills detailed: #Microsoft Power BI #Datasets #Data Modeling #XML (eXtensible Markup Language) #Security #Data Engineering #Azure ADLS (Azure Data Lake Storage) #Python #Storage #ETL (Extract, Transform, Load) #JSON (JavaScript Object Notation) #Azure #ADF (Azure Data Factory) #Azure SQL #Data Quality #Metadata #Dataflow #PySpark #BI (Business Intelligence) #Scala #Visualization #Spark (Apache Spark) #Data Governance #AI (Artificial Intelligence) #Data Lake #ADLS (Azure Data Lake Storage) #Microsoft Azure #Azure Data Factory #SQL (Structured Query Language)
Role description
Role: Azure Data Platform Architect with ETL
Location: Redmond, WA (Onsite from day 1)
Key Responsibilities
Design and implement ETL pipelines using Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Warehouse, SQL) and Azure Data Factory.
Build and maintain a metadata-driven Lakehouse architecture with threaded datasets to support multiple consumption patterns (a rough sketch of this pattern follows this list).
Develop agent-specific data lakes and an orchestration layer for an "uber agent" that can query across agents to answer customer questions.
Enable interactive data consumption via Power BI, Azure OpenAI, and other analytics and AI tools.
Ensure data quality, lineage, and governance across all ingestion and transformation processes.
Collaborate with product teams to understand data needs and deliver scalable solutions.
Optimize performance and cost across storage and compute layers.
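For illustration of the metadata-driven ingestion pattern referenced above, the following PySpark sketch reads a hypothetical metadata table and lands each declared source as a Delta table in the Lakehouse. The table name, config columns, and paths are assumptions for the example, not details of this role.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical metadata table: one row per source to ingest.
# Assumed columns: source_path, source_format, target_table, load_mode.
for row in spark.read.table("config.ingestion_metadata").collect():
    # Read the source in whatever format the metadata declares.
    df = (spark.read
          .format(row["source_format"])   # e.g. "csv", "json", "parquet"
          .option("header", "true")       # used by CSV, ignored by other formats
          .load(row["source_path"]))

    # Land the data as a Delta table in the Lakehouse so Power BI,
    # notebooks, and downstream agents all consume the same copy.
    (df.write
       .format("delta")
       .mode(row["load_mode"])            # e.g. "append" or "overwrite"
       .saveAsTable(row["target_table"]))

New sources are then onboarded by adding a row of metadata rather than writing a new pipeline, which is the reusable-component approach the qualifications below call for.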
Required Qualifications
7+ years of experience in data engineering with a focus on Microsoft Azure Data stack and Fabric technologies.
Strong expertise in:
Executive-level written and oral communication
Microsoft Fabric (Lakehouse, Dataflows Gen2, Pipelines, Notebooks)
Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen2
Power BI and/or other visualization tools
SQL, Python, and PySpark/Scala
Experience working with structured and semi-structured data (JSON, XML, CSV, Parquet); see the sketch after this list.
Proven ability to build metadata-driven architectures and reusable components.
Familiarity with agent-based architectures and conversational AI integration is a plus.
Strong understanding of data modeling, data governance, and security best practices.
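As a rough illustration of handling the file formats listed above, the following PySpark sketch reads CSV, JSON, Parquet, and XML sources. All paths are hypothetical placeholders, and the XML read assumes the spark-xml package (com.databricks:spark-xml) is installed.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

orders = spark.read.option("header", "true").csv("Files/raw/orders.csv")   # structured
events = spark.read.json("Files/raw/events.json")                          # semi-structured
history = spark.read.parquet("Files/curated/history.parquet")              # columnar

# XML requires the spark-xml package; shown only to indicate the approach.
invoices = (spark.read
            .format("xml")
            .option("rowTag", "invoice")
            .load("Files/raw/invoices.xml"))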