HashRoot

Data Engineer (Microsoft Fabric)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Microsoft Fabric) in Newark, US, for an immediate joiner. The contract length is unspecified and the pay rate is listed as unknown. The role requires 7+ years of experience and expertise in Microsoft Fabric, PySpark, SQL, and Azure integration.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Newark, NJ
-
🧠 - Skills detailed
#Data Quality #ETL (Extract, Transform, Load) #Semantic Models #Oracle #PySpark #SQL (Structured Query Language) #Data Engineering #Azure #Spark (Apache Spark) #Capacity Management #Dataflow #Data Transformations #Data Modeling
Role description
Position: Data Engineer (Microsoft Fabric)
Experience: 7+ years
Location: Newark, US
Notice Period: Immediate Joiners

Job Overview
We are looking for a Microsoft Fabric Data Engineer to accelerate a backlog of data engineering initiatives. The role involves building ingestion pipelines, implementing transformation logic, and ensuring data readiness for financial reporting workflows. This position is critical for enabling application modernization efforts to integrate cleanly with Fabric.

Key Responsibilities
• Design and build ingestion pipelines in Microsoft Fabric
• Develop data transformations using Spark / PySpark
• Build and manage Lakehouse / Warehouse structures
• Implement Dataflows Gen2
• Design data quality and reconciliation frameworks
• Optimize Fabric capacity usage and performance
• Integrate ACA applications with Fabric pipelines
• Collaborate with application teams to define data contracts
• Monitor Fabric using Monitor Hub and capacity metrics
• Ensure data readiness SLAs are met

Key Competencies

Required Skills

Fabric Expertise
• Lakehouse architecture
• Pipelines
• Notebooks (Spark)
• Semantic models
• Delta format
• Dataflows Gen2
• Capacity management

Data Engineering
• PySpark
• SQL
• Data modeling
• ETL/ELT design
• Data validation frameworks

Azure Integration
• Event Hub / Service Bus
• Managed Identity
• Secure integration patterns

Nice to Have
• Experience integrating Salesforce data
• Exposure to Oracle / accounting system integrations
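To give candidates a sense of the "data quality and reconciliation frameworks" responsibility mentioned above, the following is a minimal sketch of a reconciliation check between a source extract and its loaded target. All names (`reconcile`, the `id`/`amount` fields, the tolerance value) are illustrative assumptions, not part of the posting; a real implementation would run against Fabric Lakehouse tables rather than in-memory lists.

```python
# Minimal reconciliation sketch: compare keys and summed amounts between a
# source extract and its target. Names and tolerance are illustrative only.

def reconcile(source_rows, target_rows, key, amount_field, tolerance=0.01):
    """Return a list of reconciliation issues between source and target."""
    issues = []

    # Key-level check: every source key should land in the target.
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    missing = src_keys - tgt_keys
    if missing:
        issues.append(f"{len(missing)} source rows missing from target")

    # Aggregate check: summed amounts should match within a tolerance.
    src_total = sum(row[amount_field] for row in source_rows)
    tgt_total = sum(row[amount_field] for row in target_rows)
    if abs(src_total - tgt_total) > tolerance:
        issues.append(f"amount mismatch: source={src_total} target={tgt_total}")

    return issues


source = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 50.0}]
target = [{"id": 1, "amount": 100.0}]
print(reconcile(source, target, key="id", amount_field="amount"))
```

In a Fabric context, the same shape of check would typically be scheduled after each pipeline run, with failures surfaced before downstream financial reporting consumes the data.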