CONFLUX SYSTEMS

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with expertise in Microsoft Fabric and Snowflake. It is an on-site contract position, focusing on data platform design, ELT development, and governance. Required skills include PySpark, SQL, and advanced knowledge of cloud data solutions.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Madison, WI
-
🧠 - Skills detailed
#ADF (Azure Data Factory) #Clustering #ETL (Extract, Transform, Load) #Security #Monitoring #Batch #Databricks #Data Quality #Azure #Snowflake #Spark SQL #Storage #Metadata #Scala #Azure Data Factory #Data Modeling #PySpark #Data Engineering #SQL (Structured Query Language) #Infrastructure as Code (IaC) #Spark (Apache Spark) #Terraform #Cloud #Data Warehouse
Role description
Senior Data Engineer – Microsoft Fabric & Snowflake (ADF, Databricks)
Only local candidates may apply.
Role Summary
Own end-to-end data platform design and implementation across Lakehouse and Cloud Data Warehouse environments. Deliver POCs and production-grade solutions covering architecture evaluation, performance tuning, governance, and cost optimization.
Key Responsibilities
• Architecture: Design scalable Lakehouse/Warehouse solutions; implement the Medallion pattern (Bronze/Silver/Gold); evaluate Fabric vs. Databricks vs. Snowflake.
• Ingestion & Orchestration: Build batch/CDC pipelines (MERGE, Streams, watermarks); metadata-driven workflows; scheduling and monitoring.
• Transformation: Develop ELT using PySpark and SQL; optimize storage (partitioning, clustering, Delta OPTIMIZE); tune compute and concurrency.
• Data Modeling: Star schemas, surrogate keys, layered schemas (Raw → Curated), data quality enforcement.
• Governance & Security: RBAC, lineage, masking, Unity Catalog/Fabric governance.
• Performance & Cost: Throughput benchmarking; SKU/DBU/credit optimization; workload isolation and scalability testing.
Core Skills
• Advanced: Microsoft Fabric, Azure Data Factory, Databricks (Delta, Unity Catalog), Snowflake (warehouses, Streams/Tasks), PySpark, SQL
• Strong: Medallion architecture, dimensional modeling, governance & security
• Working: CI/CD, Infrastructure as Code (Terraform/ARM), CDC/streaming, capacity planning
• Experience: Cross-platform architecture comparison and high-concurrency tuning
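For candidates unfamiliar with the watermark-based CDC pattern named in the responsibilities, it can be sketched in plain Python. This is a minimal in-memory illustration only: the `updated_at` column name, the dict-based target table, and the `incremental_load` helper are hypothetical, and a real pipeline would issue MERGE statements through PySpark, ADF, or Snowflake Streams/Tasks rather than mutate a dict.

```python
from datetime import datetime

def incremental_load(source_rows, target, watermark):
    """Upsert rows changed since the last watermark into the target.

    source_rows: list of dicts, each with 'id' and 'updated_at' keys
    target: dict mapping id -> row (stands in for the Silver table)
    watermark: datetime recorded after the last successful load
    Returns the new watermark (max 'updated_at' processed).
    """
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] > watermark:   # CDC: only rows past the watermark
            target[row["id"]] = row         # MERGE semantics: update or insert by key
            if row["updated_at"] > new_watermark:
                new_watermark = row["updated_at"]
    return new_watermark

# Example: one incremental run against a previously loaded watermark
target = {}
rows = [
    {"id": 1, "val": "a", "updated_at": datetime(2026, 1, 1)},
    {"id": 2, "val": "b", "updated_at": datetime(2026, 1, 2)},
]
wm = incremental_load(rows, target, datetime(2026, 1, 1))
```

Only the row dated after the stored watermark is picked up, and the watermark advances so the next scheduled run skips everything already loaded; this is the same bookkeeping a metadata-driven orchestration table would persist between runs.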