ANSIT INC

Fabric Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Fabric Data Engineer on a contract-to-hire basis, requiring 3+ years of data engineering experience and proficiency in Microsoft Fabric, SQL, and Python. Pay is competitive; work is hybrid with monthly travel to client offices.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 10, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Science #Azure #Azure cloud #Data Warehouse #Data Accuracy #Data Governance #BI (Business Intelligence) #Spark (Apache Spark) #Data Storage #Scala #Microsoft Power BI #Agile #Python #Code Reviews #Semantic Models #Compliance #Cloud #Azure DevOps #ETL (Extract, Transform, Load) #Synapse #DevOps #GitHub #Documentation #PySpark #DAX #KQL (Kusto Query Language) #Dataflow #Storage #Data Engineering #Data Pipeline #Delta Lake #Data Integration #Security #SQL (Structured Query Language) #Computer Science
Role description
NO SPONSORSHIP: US Citizens or Green Card holders only. Candidates must be able to work directly for ANSIT and/or the Client. Our Client, an agrotech company with offices in the US, Canada, and Mexico, is hiring a talented Microsoft Fabric Data Engineer. The client runs an on-prem data center alongside Azure cloud services. This is a Contract-to-Hire role.

Responsibilities
• Design, build, and maintain scalable data pipelines and solutions using Microsoft Fabric components.
• Develop and manage data integration workflows using Pipelines, Notebooks, Dataflows, and Synapse.
• Optimize data storage and retrieval using OneLake, Delta Lake, and Lakehouse architecture (an illustrative sketch follows this description).
• Collaborate with data scientists, analysts, and BI developers to ensure data accuracy and accessibility.
• Implement robust data governance, security, and compliance practices using Fabric's built-in tools.
• Monitor and troubleshoot data workflows and performance issues.
• Participate in code reviews, solution architecture discussions, and agile ceremonies.
• Create and maintain documentation for data models, processes, and configurations.

Qualifications
Required:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 3+ years of experience in data engineering or a similar role.
• Hands-on experience with the Microsoft Fabric ecosystem, including Synapse, Dataflows, Power BI, semantic models, Data Warehouse, and OneLake.
• Proficiency in SQL, Python or PySpark, KQL, and DAX.
• Strong SQL query optimization and troubleshooting skills.
• Experience working with Lakehouse architectures, Delta Lake tables, and real-time intelligence over streaming data.
• Strong understanding of data warehousing design and best practices, ETL/ELT pipelines, and cloud data platforms (preferably Azure).
• Familiarity with CI/CD practices in data engineering.

Preferred:
• Microsoft Certified: Fabric Analytics Engineer Associate or similar.
• Experience with GitHub, Azure DevOps, and other development lifecycle tools.
• Knowledge of data governance frameworks and tools (e.g., Microsoft Purview).
• Excellent communication and collaboration skills.

Travel and Work Requirements
• Willingness to travel to the Client's US offices (once a month); the Client has 18 offices in the US.
• Able to lift up to 50 lbs., if necessary.
• Ability to work extended hours when necessary and as needed.
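For context on the Lakehouse and Delta Lake responsibilities listed above, here is a minimal, illustrative PySpark sketch of the kind of ingestion step a Fabric notebook might contain. It assumes a notebook attached to a Lakehouse; the folder path, column names, and table name (Files/landing/orders/, order_id, order_amount, orders_bronze) are placeholders for illustration only, not details from the client's environment.

```python
# Minimal sketch of a Fabric Lakehouse ingestion step (placeholder names throughout).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook a Spark session is already provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Read raw landing-zone files from the Lakehouse "Files" area (path is an assumption).
raw = (
    spark.read
    .option("header", "true")
    .csv("Files/landing/orders/")
)

# Light cleanup: cast a typed column, add an ingestion timestamp, drop duplicate keys.
clean = (
    raw
    .withColumn("order_amount", F.col("order_amount").cast("double"))
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["order_id"])
)

# Persist as a managed Delta table in the Lakehouse (stored in OneLake underneath).
(
    clean.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("orders_bronze")
)
```

A step like this would typically be orchestrated by a Fabric Pipeline and surfaced downstream through semantic models and Power BI, matching the workflow described in the responsibilities.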