

Fabric Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Fabric Data Engineer on a contract-to-hire basis, requiring 3+ years of data engineering experience and expertise in Microsoft Fabric, SQL, and Python. Must be a US Citizen or GC Holder; hybrid work with monthly travel.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 1, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed:
#Code Reviews #Data Accuracy #Microsoft Power BI #Data Pipeline #SQL (Structured Query Language) #DevOps #Data Warehouse #Data Governance #Agile #Synapse #Compliance #Spark (Apache Spark) #Data Science #PySpark #Python #Security #Data Engineering #Scala #Azure #GitHub #BI (Business Intelligence) #Data Integration #KQL (Kusto Query Language) #Semantic Models #ETL (Extract, Transform, Load) #Dataflow #Azure DevOps #Azure cloud #Storage #Computer Science #Data Storage #Delta Lake #Cloud #Documentation #DAX
Role description
NO SPONSORSHIP - GC/USC only. Candidates must be able to work directly for ANSIT and/or the Client.
Our client, an agrotech company with offices in the US, Canada, and Mexico, is hiring a talented Microsoft Fabric Data Engineer. The client runs an on-prem data center alongside Azure cloud services.
The company is unable to sponsor and is considering US Citizens or GC Holders only.
This is a Contract to Hire role.
Responsibilities
• Design, build, and maintain scalable data pipelines and solutions using Microsoft Fabric components (a minimal sketch follows this list).
• Develop and manage data integration workflows using Pipelines, Notebooks, Dataflows, and Synapse.
• Optimize data storage and retrieval using OneLake, Delta Lake, and Lakehouse architecture.
• Collaborate with data scientists, analysts, and BI developers to ensure data accuracy and accessibility.
• Implement robust data governance, security, and compliance practices using Fabric's built-in tools.
• Monitor and troubleshoot data workflows and performance issues.
• Participate in code reviews, solution architecture discussions, and agile ceremonies.
• Create and maintain documentation for data models, processes, and configurations.
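To illustrate the pipeline work described above, here is a minimal PySpark sketch of the kind of job that might run in a Fabric notebook: read raw files landed in a Lakehouse, apply a light transform, and write the result to a Delta Lake table. All paths and table names are hypothetical, and the Fabric workspace setup is assumed (in a Fabric notebook, `spark` is provided automatically; the standalone session here keeps the sketch self-contained).

```python
# Minimal Fabric-notebook-style pipeline sketch (hypothetical paths and tables).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Read raw CSV files landed in the Lakehouse Files area (path is illustrative).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("Files/raw/orders/")
)

# Light cleanup: drop rows missing keys, parse a timestamp, stamp the load date.
clean = (
    raw.dropna(subset=["order_id", "customer_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("load_date", F.current_date())
)

# Write to a managed Delta table in the Lakehouse (table name is illustrative).
clean.write.format("delta").mode("overwrite").saveAsTable("orders_clean")
```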
Qualifications
Required:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 3+ years of experience in data engineering or a similar role.
• Hands-on experience with the Microsoft Fabric ecosystem, including Synapse, Dataflows, Power BI, semantic models, Data Warehouse, and OneLake.
• Proficiency in SQL, Python or PySpark, KQL, and DAX.
• Strong SQL query optimization and troubleshooting skills.
• Experience working with Lakehouse architectures, Delta Lake tables, and Real-Time Intelligence over streaming data (see the streaming sketch after this list).
• Strong understanding of data warehousing design and best practices, ETL/ELT pipelines, and cloud data platforms (preferably Azure).
• Familiarity with CI/CD practices in data engineering.
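For the streaming requirement above, one common pattern (among several; Fabric's Real-Time Intelligence workload also offers Eventstreams and KQL databases) is Spark Structured Streaming appending into a Delta table. A minimal sketch, with hypothetical paths and schema:

```python
# Structured Streaming sketch: land streaming events in a Delta table.
# Source and sink paths are hypothetical; in Fabric the source would typically
# be an Eventstream, Event Hub, or files arriving in the Lakehouse.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream
    .format("json")  # assume newline-delimited JSON events
    .schema("event_id STRING, device_id STRING, reading DOUBLE, event_ts TIMESTAMP")
    .load("Files/landing/events/")
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "Files/checkpoints/events/")  # needed for fault tolerance
    .outputMode("append")
    .toTable("events_raw")  # streaming write to a managed Delta table
)

query.awaitTermination()
```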
Preferred:
• Microsoft Certified: Fabric Analytics Engineer Associate or similar.
• Experience with GitHub, Azure DevOps, and other development lifecycle tools.
• Knowledge of data governance frameworks and tools (e.g., Microsoft Purview).
• Excellent communication and collaboration skills.
Travel and Work Requirements
• Willingness to travel to the client's US offices roughly once a month; the client has 18 offices in the US.
• Able to lift up to 50 lbs. if necessary.
• Ability to work extended hours as needed.