Azure Data Architect-Microsoft Fabric

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Architect - Microsoft Fabric with a contract length of "unknown" and a pay rate of "unknown". Candidates need 15-20 years of experience, expertise in Microsoft Fabric and data architecture design, and strong communication skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 13, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Cloud #GDPR (General Data Protection Regulation) #Azure #Migration #Data Governance #Oracle #Scripting #Data Quality #Metadata #SQL (Structured Query Language) #GitHub #Scala #Azure DevOps #Batch #Data Engineering #Microsoft Power BI #Data Access #Data Catalog #Data Privacy #Synapse #Data Integrity #DevOps #Python #SQL Server #Automation #Data Warehouse #Agile #Data Architecture #Data Lake #Compliance #Data Processing #DAX #Data Integration #Physical Data Model #Security #Data Management #BI (Business Intelligence) #Dataflow #ETL (Extract, Transform, Load) #Data Pipeline #Teradata #Storage
Role description
This is a W2-only role (strictly no C2C). We are looking for candidates with 15 to 20 years of experience.

Job description:
• Act as the single point of contact for the client for technical delivery
• Design and develop robust data architecture that supports the organization's data needs, including data lake and data warehouse components
• Define data architecture framework components: integration, data management, and data consumption
• Create conceptual, logical, and physical data models to ensure data integrity and consistency
• Establish and enforce data governance policies and standards to maintain data quality and security
• Design and implement data integration solutions to extract, transform, and load data from various sources into the data lake and data warehouse
• Leverage Azure and Microsoft Fabric services to build scalable, cost-effective data solutions
• Optimize data queries and ETL processes to ensure efficient data access and analysis
• Collaborate with data engineers, analysts, and business users to understand their requirements and translate them into technical solutions
• Understand requirements and proactively clarify any doubts or assumptions
• Be a team player
• Be flexible with working hours (collaborate with a team in India, in the IST time zone)
• Be self-motivated and passionate about becoming a technology expert
• Be accountable for your work and committed to deadlines
• Be able to set an implementation path for projects
• Be able to produce effort estimations for projects
• Be able to deliver customer demos

Key skills: Microsoft Fabric, DBX, Data Lake, Data Factory, Azure, Data Warehouse, Modern Data Platforms Migration

Primary (must-have) skills:
• Excellent verbal and written communication; able to speak confidently throughout a conversation
• Experience working with customer business/IT teams

Data Architecture Design:
1. Expertise in designing scalable and modular data architectures (data lake + data warehouse)
2. Experience creating conceptual, logical, and physical data models

Azure & Microsoft Fabric:
3. Strong knowledge of Microsoft Fabric components (Lakehouse, Dataflows, Pipelines, etc.)
4. Proficiency with Azure services: Synapse, Data Factory, Data Lake Storage, SQL DB, etc.

ETL/ELT & Data Integration:
5. Experience designing and implementing data pipelines for extraction, transformation, and loading from diverse sources
6. Familiarity with both batch and real-time data processing

Data Governance & Management:
7. Understanding of data quality frameworks, metadata management, lineage, and access control
8. Ability to define and enforce governance policies

Performance Optimization:
9. Tuning ETL jobs, optimizing data models, and improving query performance

Stakeholder Communication:
10. Ability to translate business requirements into technical solutions
11. Experience serving as a single point of contact for clients on technical delivery

Modern Data Platforms Migration:
12. Experience modernizing legacy data warehouses to cloud-native platforms

Well-Architected Framework:
13. Best design and coding practices
14. Experience working in challenging environments with unclear requirements and contributing collaboratively with the team to refine them
15. Ability to offer innovative solutions and ideas that are modern and effective

Secondary skills (good to have):
1. Legacy DW experience: familiarity with traditional platforms such as Teradata, Netezza, Oracle, or SQL Server
2. Power BI: working knowledge of Power BI for reporting, semantic model design, and DAX
3. CI/CD & DevOps: exposure to deploying data pipelines via Azure DevOps or GitHub Actions
4. Security & compliance: understanding of data privacy, encryption, and regulatory standards (e.g., GDPR, HIPAA)
5. Data catalogs: experience with Microsoft Purview or other metadata management tools
6. Agile methodology: comfort with Agile project delivery and sprint-based planning
7. Scripting & automation: ability to use SQL, PowerShell, or Python for automation or tooling support

Certifications:
• Preferred: Microsoft Certified: Azure Data Engineer Associate
• Additional beneficial certifications: DP-600 (Microsoft Fabric Analytics Engineer), DP-700, DP-203 (Azure Data Engineer), Azure Solutions Architect Expert, or equivalent cloud/data certifications

Soft skills / other skills:
1. Good oral and written communication
2. Good team player
3. Proactive and adaptive
4. Good experience working with customer teams
5. Good attitude at work
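For candidates gauging the ETL/ELT expectations above, the batch extract-transform-load pattern the posting references can be sketched in plain Python. This is an illustrative sketch only, not the client's stack: the function names, the sample rows, and the in-memory "warehouse" target are all hypothetical stand-ins for real source and lakehouse/warehouse systems.

```python
# Minimal batch ETL sketch: extract rows, apply data-quality transforms, load.
# Everything here (names, sample data, targets) is illustrative, not from the posting.

def extract():
    # Stand-in for reading from a source system (an OLTP table, file drop, etc.).
    return [
        {"id": 1, "amount": "100.50", "region": " East "},
        {"id": 2, "amount": "75.00", "region": "WEST"},
        {"id": 3, "amount": None, "region": "east"},  # dirty row: missing amount
    ]

def transform(rows):
    # Enforce basic data quality: drop rows with missing amounts,
    # cast strings to numeric types, and normalize region casing.
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # a production pipeline would route this to a reject/audit table
        clean.append({
            "id": row["id"],
            "amount": float(row["amount"]),
            "region": row["region"].strip().lower(),
        })
    return clean

def load(rows, target):
    # Stand-in for writing to a warehouse or lakehouse table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)        # 2 rows survive the quality check
print(warehouse[0])  # {'id': 1, 'amount': 100.5, 'region': 'east'}
```

In Fabric or Azure Data Factory terms, each function maps to a pipeline activity; the same shape scales from this toy version to Dataflows or Spark notebooks.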