Vidorra Consulting Group

Microsoft Fabric Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Microsoft Fabric Data Architect; the contract length and pay rate are not specified. Candidates should have 12+ years of SQL experience, 5+ years with Azure data technologies, and expertise in Microsoft Fabric and ETL/ELT tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Newark, NJ
-
🧠 - Skills detailed
#Data Lake #Data Engineering #Programming #Cloud #Data Management #PySpark #Data Modeling #Azure Cloud #Azure ADLS (Azure Data Lake Storage) #Deployment #Databricks #Data Architecture #Azure SQL #Synapse #Azure Data Factory #Leadership #SQL (Structured Query Language) #ADF (Azure Data Factory) #ETL (Extract, Transform, Load) #Data Warehouse #ADLS (Azure Data Lake Storage) #Spark (Apache Spark) #Storage #Big Data #Azure #Schema Design #Spark SQL #Data Ingestion
Role description
About the Role: We are seeking a seasoned, highly experienced Microsoft Fabric Data Architect/Principal Data Engineer to lead the design, architecture, and deployment of complex data and analytics solutions on the Azure Data Platform and Microsoft Fabric. This role demands strong thought leadership in modern data architecture principles, driving technology transformation, and ensuring alignment with target-state architecture and roadmaps.

Key Responsibilities
• Serve as the key technical expert in designing, architecting, deploying, and maintaining cloud solutions for Data Management & Analytics.
• Lead the architecture direction for engineers, defining target-state technology architecture and roadmaps.
• Design application solutions on the Azure Data Platform for complex business requirements.
• Demonstrate strong knowledge of ETL/ELT patterns for data ingestion, processing, and distribution using Azure cloud services and Microsoft Fabric (a brief illustrative sketch follows this description).
• Drive technology proofs of concept and continuous technology transformation to minimize technical debt.

Essential Skills
• 12+ years of proven experience with SQL, schema design, and dimensional data modeling.
• 5+ years with modern data engineering, data warehousing, and data lake technologies on cloud platforms (specifically Azure).
• Expertise in the Microsoft Fabric Data Platform and Big Data tools (Databricks, Spark).
• Strong programming skills in PySpark and Spark SQL.
• Proficiency with ETL/ELT tools such as Azure Data Factory (ADF) and data warehousing technologies (Azure Synapse, Azure SQL, Azure Data Lake Storage (ADLS)).

Desirable Skills
• Strong knowledge of data warehouse best practices, development standards, and methodologies.
• Ability to work as an independent self-learner in a fast-paced, dynamic environment.
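For orientation, the sketch below shows the kind of ETL/ELT ingestion step referenced in the responsibilities above, written in PySpark (one of the listed skills). It is a minimal illustration, not a definitive implementation: it assumes a Microsoft Fabric or Databricks notebook environment where Delta Lake is available, and the lakehouse path Files/raw/orders.csv, the table name silver_orders, and the column names OrderDate, amount, and order_id are hypothetical placeholders.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Reuse the notebook-provided Spark session if one exists; otherwise create one locally.
spark = SparkSession.builder.appName("orders-ingestion-sketch").getOrCreate()

# Ingest raw CSV data from a (hypothetical) lakehouse path into a DataFrame.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("Files/raw/orders.csv")
)

# Light cleanup: standardize a column name, cast types, and drop rows missing the key.
cleaned = (
    raw.withColumnRenamed("OrderDate", "order_date")
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropna(subset=["order_id"])
)

# Persist as a Delta table for downstream dimensional modeling and reporting.
cleaned.write.format("delta").mode("overwrite").saveAsTable("silver_orders")

The same ingest-transform-persist shape could equally be expressed in Spark SQL or orchestrated from Azure Data Factory pipelines; the snippet is only meant to make the pattern concrete.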