MSH

Fabric Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Fabric Data Engineer; the contract length and pay rate are unspecified. Key skills include Azure Data Factory, Spark, ETL, and data quality. Experience with Agile methodologies and Delta Lake is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 12, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Los Angeles, CA
-
🧠 - Skills detailed
#Spark (Apache Spark) #Delta Lake #ETL (Extract, Transform, Load) #Agile #Data Ingestion #Datasets #UAT (User Acceptance Testing) #Data Framework #Data Quality #Dataflow #Monitoring #Documentation #Data Engineering
Role description
Builds, maintains, and optimizes data ingestion pipelines, Lakehouse tables, transformation logic, data quality checks, and operational workflows within Microsoft Fabric.
Key Responsibilities:
• Architect and maintain medallion data frameworks.
• Build ingestion pipelines using Data Factory, Spark Notebooks, Dataflows Gen2, Event Streams, and Lakehouse connectors.
• Develop Silver/Gold transformations including cleansing, enrichment, merging, and standardization.
• Implement data quality checks, validation rules, auditing columns, and error-handling patterns.
• Optimize Lakehouse and Delta Lake operations (schema evolution, partitioning, Z-Order, caching, vacuum, compaction).
• Support production operations including monitoring, incident triage, defect resolution, and performance tuning.
• Create and maintain mapping sheets, schema documentation, lineage diagrams, and runbooks.
• Coordinate with analytics teams to prepare model-ready Gold datasets.
• Participate in Agile ceremonies, design reviews, and technical planning.
• Support UAT, test data preparation, and validation activities.
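As a rough illustration of the data-quality, validation-rule, and audit-column responsibilities above, here is a minimal sketch in plain Python. In a Fabric workload this logic would typically run over Spark DataFrames in a notebook; the column names and rules below are hypothetical, not taken from the posting.

```python
from datetime import datetime, timezone

# Hypothetical validation rules for a Silver-layer cleansing step:
# each rule maps a column name to a predicate its value must satisfy.
RULES = {
    "customer_id": lambda v: isinstance(v, str) and v.strip() != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def cleanse(records, rules=RULES):
    """Split raw (Bronze) rows into accepted (Silver) and rejected rows,
    stamping audit columns on every accepted record."""
    valid, rejected = [], []
    now = datetime.now(timezone.utc).isoformat()
    for row in records:
        failures = [col for col, ok in rules.items() if not ok(row.get(col))]
        if failures:
            # Error-handling pattern: route bad rows to a reject set with
            # the failed rules recorded, rather than dropping them silently.
            rejected.append({**row, "_dq_failed_rules": failures})
        else:
            # Audit columns: ingestion timestamp and source-layer marker.
            valid.append({**row, "_ingested_at": now, "_source": "bronze"})
    return valid, rejected

bronze = [
    {"customer_id": "C001", "amount": 42.5},
    {"customer_id": "", "amount": -3},
]
silver, rejects = cleanse(bronze)
```

The same split-and-stamp pattern scales to Spark by expressing the rules as filter expressions and writing the reject set to its own Delta table for triage.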