Artmac

Senior Data Engineer - Microsoft Fabric & Azure Analytics

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer specializing in Microsoft Fabric and Azure Analytics, located in Redmond, Washington. The contract is W2/C2C, requiring 9-18 years of experience, expertise in Azure technologies, and strong skills in SQL, Python, and data engineering.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Redmond, WA
-
🧠 - Skills detailed
#Azure ADLS (Azure Data Lake Storage) #Azure SQL #Security #Storage #XML (eXtensible Markup Language) #ADLS (Azure Data Lake Storage) #Data Lake #ETL (Extract, Transform, Load) #Spark (Apache Spark) #BI (Business Intelligence) #Microsoft Azure #Data Pipeline #Microsoft Power BI #Scala #Azure #Dataflow #Data Engineering #ADF (Azure Data Factory) #Data Modeling #Datasets #Metadata #SQL (Structured Query Language) #Data Quality #Python #Consulting #PySpark #JSON (JavaScript Object Notation) #Azure Data Factory #Visualization #Programming #AI (Artificial Intelligence)
Role description
Who We Are
Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers.

Job Description
Job Title: Senior Data Engineer – Microsoft Fabric & Azure Analytics
Job Type: W2/C2C
Experience: 9–18 years
Location: Redmond, Washington

Requirements
• 7+ years of experience in data engineering with a focus on the Microsoft Azure data stack and Fabric technologies.
• Expertise in Microsoft Fabric: Lakehouse, Dataflows Gen2, Pipelines, and Notebooks.
• Strong experience with Azure Data Factory, Azure SQL, and Azure Data Lake Storage Gen2.
• Experience working with structured and semi-structured data (JSON, XML, CSV, Parquet).
• Experience with conversational AI platforms or integration with AI-driven analytics.
• Knowledge of multi-agent orchestration and federated query systems.
• Strong track record of scaling data pipelines and optimizing compute/storage performance.
• Proven ability to build metadata-driven architectures and reusable components.
• Proficiency in Power BI or other visualization and reporting tools.
• Strong programming skills in SQL, Python, PySpark, and/or Scala.
• Familiarity with agent-based architectures and conversational AI integration is a plus.
• Solid understanding of data modeling, governance, security, and best practices.

Responsibilities
• Design, implement, and manage ETL/ELT pipelines using Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Warehouse, SQL) and Azure Data Factory.
• Build and maintain metadata-driven Lakehouse architectures with threaded datasets to support multiple data consumption patterns.
• Develop agent-specific data lakes and an orchestration layer for a central "uber" agent capable of querying across agents to answer customer questions.
• Enable interactive analytics and AI-driven insights via Power BI, Azure OpenAI, and other tools.
• Ensure data quality, lineage, security, and governance across all ingestion and transformation processes.
• Collaborate closely with product, analytics, and AI teams to understand business requirements and deliver scalable solutions.
• Optimize storage, compute, and pipeline performance while balancing cost-efficiency.
• Document best practices, standards, and reusable components to enable long-term maintainability.

Qualification
• Bachelor's degree or equivalent combination of education and experience.