

Microsoft Fabric Analytics Engineer (W2 Contract)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Microsoft Fabric Analytics Engineer (W2 Contract) based in Salt Lake City, UT (Hybrid – 2x/week). Required skills include Azure Cosmos DB, Azure Synapse Analytics, Power BI, and advanced SQL. Certifications in Azure Data Analyst, Azure Data Engineer, and Azure Cosmos DB Developer are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Salt Lake City, UT
🧠 - Skills detailed
#Deployment #Visualization #Azure Data Factory #Datasets #ADF (Azure Data Factory) #Spark (Apache Spark) #Synapse #Azure Cosmos DB #Data Analysis #Data Cleansing #DevOps #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Data Integration #GIT #Microsoft Power BI #JSON (JavaScript Object Notation) #BI (Business Intelligence) #Azure Synapse Analytics #Cloud #API (Application Programming Interface) #SQL (Structured Query Language) #Azure #Containers #SQL Queries #PySpark #NoSQL #Data Transformations #Azure DevOps #Data Pipeline #DAX #Scala #Data Engineering #Data Modeling
Role description
Role: Microsoft Fabric Analytics Engineer
Location: Salt Lake City, UT (Hybrid – 2x/week)
Local Candidates Only
Skill Requirements:
Skill Type | Skill Name | Experience / Details
Skill | Azure Cosmos DB | 2–3 years hands-on, specifically with the NoSQL API and complex JSON data
Skill | Azure Synapse Analytics | 3–5 years, focus on Synapse Link and T-SQL
Skill | Synapse Serverless SQL Pools | 3–5 years, focus on Synapse Link and T-SQL
Skill | Data Integration (Azure Data Factory) | 3–5 years building and managing data pipelines
Skill | Power BI | 5+ years as a Power BI expert; advanced data modeling, DAX, Power Query (M Language), report & dashboard development
Skill | SQL | 5+ years of advanced SQL for querying and manipulation
Other | Collaborative, cross-functional team experience | Proven ability to work in a collaborative environment
____________________________________________
Preferred Skills
Skill Type | Skill Name
Skill | Spark Pools
Skill | Git or Azure DevOps for code management
Skill | AWS or Google Cloud
____________________________________________
Certifications
Certification Type | Certification Name
Certification | Azure Data Analyst Associate (PL-300)
Certification | Azure Data Engineer Associate (DP-203)
Certification | Azure Cosmos DB Developer Specialty
Job Requirements
· Azure Cosmos DB: This is the foundational skill. The contractor must have knowledge of Cosmos DB, including its different APIs (especially for NoSQL), and a strong understanding of how to query and handle complex, unstructured JSON data.
· Azure Synapse Analytics: This is the central hub for the reporting solution. The contractor needs to be proficient in Azure Synapse, particularly in using Synapse Link for Cosmos DB to enable near-real-time analytics without impacting transactional workloads. This includes:
· Synapse Serverless SQL Pools: The ability to write T-SQL queries against the Cosmos DB analytical store.
· Data Integration (Azure Data Factory): Experience with building data pipelines to ingest, transform, and load data from various sources into Synapse.
· Power BI: This is the reporting layer. The person must be an expert in Power BI, including:
· Data Modeling: Strong skills in creating efficient data models, including star schemas, to handle a large volume of data and complex relationships.
· DAX (Data Analysis Expressions): Proficiency in writing complex DAX measures, calculated columns, and tables to create meaningful metrics and insights.
· Power Query (M Language): Experience with data cleansing, shaping, and transformation in the Power Query editor.
· Report & Dashboard Development: The ability to design and build interactive, visually appealing, and high-performance reports and dashboards.
· SQL: A solid understanding of SQL is critical for querying and manipulating data in Synapse.
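The "complex, unstructured JSON" requirement above can be illustrated with a small, dependency-free sketch: flattening a nested Cosmos DB-style document into dot-separated columns, loosely analogous to how an analytical store or reporting layer tabularizes nested properties. The `flatten` helper and the sample document are hypothetical illustrations, not part of any Azure SDK.

```python
import json

def flatten(doc, prefix=""):
    """Recursively flatten a nested JSON document into dot-separated keys.

    Arrays are expanded with numeric indexes. Hypothetical helper for
    illustration only; not an Azure SDK function.
    """
    out = {}
    if isinstance(doc, dict):
        for key, value in doc.items():
            out.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(doc, list):
        for i, value in enumerate(doc):
            out.update(flatten(value, f"{prefix}{i}."))
    else:
        out[prefix.rstrip(".")] = doc
    return out

# A Cosmos DB-style document with nested objects and arrays
raw = json.loads(
    '{"id": "o-1", "order": {"total": 42.5, '
    '"items": [{"sku": "a"}, {"sku": "b"}]}}'
)
flat = flatten(raw)
print(flat["order.total"])        # 42.5
print(flat["order.items.1.sku"])  # b
```

In practice the same shape of transformation is what serverless SQL `WITH`-clause projections or Power Query expansion steps perform against the analytical store.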
Key Responsibilities
The contractor should be able to handle the entire project lifecycle, from initial design to final deployment. Responsibilities include:
· Solution Architecture: Designing a scalable and cost-effective reporting solution on Azure that leverages Cosmos DB, Azure Synapse, and Power BI.
· Data Engineering: Setting up and configuring Azure Synapse Link for the client's Cosmos DB containers and developing data pipelines to prepare data for reporting, including creating T-SQL views in Synapse Serverless SQL pools.
· Business Intelligence Development: Developing Power BI reports and dashboards, including data models, DAX measures, and visualizations, to meet complex reporting requirements.
· Performance Tuning: Optimizing queries in Synapse and DAX calculations in Power BI to ensure fast report load times and data refreshes.
· Stakeholder Collaboration: Working with business owners and other stakeholders to understand business requirements and translate them into technical specifications and reports.
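The "T-SQL views in Synapse Serverless SQL pools" step above can be sketched as follows. All names (account, database, container, JSON paths, view name) are hypothetical placeholders, not taken from the posting; this is a minimal sketch of the documented `OPENROWSET` pattern for querying a Cosmos DB analytical store, not a production template.

```sql
-- Sketch only: account, database, container, and JSON paths are placeholders.
-- Run in a user database (not master); in practice the account key belongs in
-- a server-scoped credential rather than inline in the connection string.
CREATE VIEW dbo.vw_Orders AS
SELECT c.id, c.customerId, c.orderTotal
FROM OPENROWSET(
    'CosmosDB',
    'Account=myaccount;Database=sales;Key=<access-key>',
    Orders
) WITH (
    id         varchar(64) '$.id',
    customerId varchar(64) '$.customer.id',
    orderTotal float       '$.order.total'
) AS c;
```

Power BI can then connect to the serverless SQL endpoint and treat such views as ordinary tables, which is what keeps reporting off the transactional workload.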
Desirable Qualifications
· Experience with Large Datasets: A proven track record of working with large-scale data and optimizing solutions for performance and cost.
· Communication Skills: The ability to clearly explain complex technical concepts to non-technical business users.
· Spark Pools (nice-to-have): Familiarity with PySpark or other languages for more complex data transformations within Synapse.