LanceSoft, Inc.

Microsoft Fabric Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Microsoft Fabric Data Engineer in Piscataway, NJ, on a 6+ month contract at $53/hr. Requires 9+ years of experience and strong skills in Microsoft Fabric, Azure services, SQL, Python, and data governance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
424
-
🗓️ - Date
April 22, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Piscataway, NJ
-
🧠 - Skills detailed
#IoT (Internet of Things) #Dataflow #Storage #Visualization #Synapse #BI (Business Intelligence) #Data Ingestion #Data Warehouse #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Transformations #Data Science #Data Modeling #Azure #Data Pipeline #Data Engineering #Azure SQL #Data Governance #Scrum #ADLS (Azure Data Lake Storage) #Security #Data Integration #DevOps #Agile #Python #Git #Data Processing #Compliance #Microsoft Power BI #Data Lake #Data Lakehouse #GitHub #Azure DevOps #Version Control #Data Quality #Datasets
Role description
• Local candidates only; in-person interviews required; 9+ years of experience.

Role: Microsoft Fabric Data Engineer
Location: Piscataway, NJ (Onsite)
Hiring Mode: Contract, 6+ months
Competencies: 4-6+ years of experience
Pay: $53/hr.

Essential Skills:
• Strong experience in Microsoft Fabric (Data Factory, Data Engineering, Data Warehouse, Data Science, Real-Time Analytics, and Power BI integration).
• Hands-on experience with Data Pipelines, Dataflows, and Lakehouse concepts.
• Expertise in Azure Synapse, Azure Data Lake Storage (ADLS), and Azure SQL.
• Proficiency in Data Modeling, ETL/ELT design, and Data Integration processes.
• Good understanding of Real-Time Data Streaming (using Event Hubs, IoT Hub, or Kusto).
• Experience with bulk data processing and performance optimization for large datasets.
• Proficiency in SQL and Python (for data transformations or orchestration logic).
• Understanding of data governance, security, and compliance in Microsoft Fabric.
• Familiarity with Power BI for analytics and visualization.
• Experience with CI/CD pipelines for Fabric (using Azure DevOps or GitHub Actions).
• Strong hands-on experience building Fabric data pipelines, dataflows, and data lakehouses.
• Experience with Real-Time data ingestion (Eventstreams, Kusto queries).
• Ability to handle large-scale data ingestion and bulk data transformations.
• Experience integrating Fabric Data Warehouse with Power BI or downstream systems.
• Familiarity with version control (Git) and working in Agile/Scrum environments.
• Solid understanding of data quality, validation, and error handling processes.