The Planet Group

Data Engineer (MS Fabric)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Microsoft Fabric) on a contract of more than 6 months, offering a salary of $110,000 – $120,000. Requirements include 2–4+ years of Microsoft Fabric experience, ETL expertise, and proficiency in SQL, Python, and PySpark.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
545
🗓️ - Date
May 1, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Microsoft Azure #PySpark #Data Integrity #Azure #ADF (Azure Data Factory) #CRM (Customer Relationship Management) #SQL (Structured Query Language) #Cloud #Azure Data Factory #Data Pipeline #Scala #Python #Spark (Apache Spark) #Data Engineering #Data Integration #AI (Artificial Intelligence)
Role description
Data Engineer (Microsoft Fabric)
Salary: $110,000 – $120,000
Employment Type: Full-time, remote (EST hours)
Authorization: No sponsorship or C2C available in any form.

Role Overview
We are seeking a hands-on Data Engineer to expand and optimize our enterprise Microsoft Fabric platform. This role focuses on transitioning a consultant-led environment into a robust internal operation, handling end-to-end ETL development, lakehouse architecture, and system integrations.

Key Responsibilities
• Pipeline Development: Design and maintain scalable data pipelines using Microsoft Fabric and Azure Data Factory.
• System Integration: Build ETL/ELT workflows to sync enterprise systems (ERP, CRM, and operational platforms).
• Platform Optimization: Enhance existing lakehouse solutions, ensuring performance, reliability, and data integrity.
• Collaboration: Partner with internal teams and external consultants to establish data engineering standards and best practices.

Required Qualifications
• 2–4+ years of hands-on experience with Microsoft Fabric in an enterprise setting.
• ETL Expertise: Proven track record of building complex pipelines with Azure Data Factory.
• Technical Stack: Proficient in SQL, Python, and PySpark.
• Architecture: Strong understanding of lakehouse architecture, APIs, and cloud data integration.
• Mindset: Proactive, autonomous, and comfortable using AI tools (Copilot, Claude) to accelerate workflows.

Preferred Skills
• Familiarity with enterprise integrations (e.g., Infor M3, Salesforce, or Anaplan).
• Relevant Microsoft Azure or Fabric certifications.

#DataEngineering #MicrosoftFabric #Azure #ETL #PySpark #Hiring #Remote #TECH