Microsoft Fabric Data Engineer (UT Local Only - USC Only)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Microsoft Fabric Data Engineer, requiring a Bachelor's degree and 3+ years of Azure-focused data engineering experience. Contract length is unspecified, with a day rate of $600. Local candidates (UT) and US citizens (USC) only. Key skills include Microsoft Fabric, Azure Data Factory, SQL, and data integration.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date discovered
September 23, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Salt Lake City, UT
-
🧠 - Skills detailed
#Data Governance #Azure SQL #Computer Science #Data Lake #Security #Documentation #GitHub #BI (Business Intelligence) #SQL (Structured Query Language) #Data Pipeline #PySpark #Azure #Dataflow #Compliance #Spark (Apache Spark) #ETL (Extract, Transform, Load) #Data Engineering #Azure DevOps #Delta Lake #Scala #Azure Synapse Analytics #Data Integration #ADF (Azure Data Factory) #Python #Synapse #DevOps #Data Ingestion #Cloud #Datasets #Storage #Azure Data Factory #Microsoft Power BI #Data Transformations
Role description
Key Responsibilities:
• Design, implement, and manage end-to-end data pipelines using Microsoft Fabric and Azure Data Factory.
• Develop data models and analytical datasets using Microsoft Fabric's OneLake, Power BI, and Dataflows.
• Integrate and optimize data ingestion, transformation, and storage across Azure services (e.g., Synapse, Data Lake, SQL, Delta Lake).
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Ensure best practices in data governance, security, compliance, and performance tuning.
• Monitor and troubleshoot data pipelines, ensuring reliability and scalability.
• Stay current with Microsoft Fabric and Azure ecosystem advancements and propose improvements.
• Create and maintain documentation for data flows, architecture, and configuration settings.

Required Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 3+ years of experience in data engineering or cloud engineering with a focus on Azure.
• Hands-on experience with Microsoft Fabric, including Data Engineering, Data Factory, and Power BI within Fabric.
• Proficiency in Azure Synapse Analytics, Azure Data Lake, Delta Lake, and Azure SQL.
• Strong knowledge of SQL, T-SQL, and Python or PySpark for data transformations.
• Experience with data integration, ETL/ELT pipelines, and data warehousing concepts.
• Familiarity with DevOps practices and tools such as Azure DevOps or GitHub Actions.
• Excellent problem-solving and communication skills.
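As a small illustration of the kind of Python data-transformation skill the posting asks for, here is a minimal, self-contained sketch of an extract/transform step. It is not from the employer and uses a hypothetical schema (`order_id`, `amount`); real Fabric pipelines would run this logic in PySpark or Dataflows rather than stdlib Python.

```python
import csv
import io

def transform_orders(raw_csv: str) -> list[dict]:
    """Minimal ETL step on a hypothetical schema: parse raw CSV,
    drop rows missing an order_id, and cast amount to float."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in reader:
        if not row.get("order_id"):
            continue  # validation: skip malformed rows
        cleaned.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),  # cast string -> float
        })
    return cleaned

# Hypothetical input: one well-formed row, one missing its id, one more valid row.
raw = "order_id,amount\nA1,19.99\n,5.00\nB2,3.50\n"
print(transform_orders(raw))
```

In a production pipeline the same validate-then-cast pattern would typically be expressed as DataFrame operations so it scales across a lakehouse.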