Axiom Global Technologies

Microsoft Fabric Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Microsoft Fabric Data Engineer, contract length unspecified, offering a competitive pay rate. Key skills include Microsoft Fabric, Azure cloud services, data modeling, and CI/CD implementation. Experience with Power BI integration and modern data architecture is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Newark, NJ
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Microsoft Azure #Data Management #Data Architecture #Observability #Data Pipeline #Microsoft Power BI #Data Storage #Azure DevOps #Cloud #Data Modeling #Monitoring #Deployment #Azure #DevOps #Scala #Data Engineering #Metadata #BI (Business Intelligence) #Azure Cloud #Data Processing #Storage #GitLab
Role description
We are seeking an experienced Microsoft Fabric Data Engineer to design, build, and optimize scalable data solutions using Microsoft Fabric and OneLake. The role involves developing efficient data pipelines, implementing modern data architecture, and enabling seamless integration with Power BI for analytics and reporting.

Key Responsibilities
• Design and implement robust data pipelines and data models using Microsoft Fabric and OneLake.
• Develop high-performance data solutions leveraging columnar storage formats such as Parquet and efficient serialization formats like Protobuf.
• Work closely with Product Owners, Azure Solution Architects, Data Engineers, and Software Engineers to build and maintain a Fabric-based enterprise data platform integrated with Power BI.
• Implement modern data architecture patterns, including Lakehouse architecture, near real-time data streaming, schema evolution, and event-driven data processing.
• Build, manage, and automate scalable data pipelines, ensuring high performance, consistent metadata management, and optimal use of storage formats.
• Apply DevOps best practices, including the implementation of CI/CD pipelines using Azure DevOps or GitLab and Infrastructure-as-Code for automated deployments.
• Monitor, troubleshoot, and enhance platform performance using observability and monitoring tools to maintain reliability and cost efficiency.
• Support cloud transformation initiatives, assisting in migrating and modernizing data platforms to Microsoft Azure and Microsoft Fabric environments.
Required Qualifications
• Hands-on experience with Microsoft Fabric and OneLake
• Strong knowledge of Parquet, Protobuf, and modern data storage and serialization formats
• Understanding of Lakehouse architecture and modern data engineering practices
• Experience working with Azure cloud services and data platforms
• Experience implementing CI/CD pipelines using Azure DevOps or GitLab
• Solid understanding of data modeling, ETL/ELT processes, and streaming data pipelines
• Experience integrating data platforms with Power BI for analytics and reporting