Senior Lead Azure Streaming Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Lead Azure Streaming Data Engineer, contract from 9/29/2025 to 6/30/2026, based in Minneapolis, MN. Pay rate is unspecified. Requires 7+ years in data engineering, Azure streaming expertise, and strong programming skills in Python or Scala.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 19, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Minneapolis, MN
-
🧠 - Skills detailed
#Security #MQTT (Message Queuing Telemetry Transport) #Data Pipeline #Leadership #Spark (Apache Spark) #Programming #Azure Synapse Analytics #Azure Stream Analytics #ADLS (Azure Data Lake Storage) #Azure DevOps #Visualization #Azure Event Hubs #C# #Data Engineering #Triggers #BI (Business Intelligence) #Data Lake #Azure #Microsoft Power BI #Data Architecture #Data Ingestion #YAML (YAML Ain't Markup Language) #REST (Representational State Transfer) #Compliance #Storage #Azure cloud #Data Processing #Delta Lake #Batch #REST API #Agile #Datasets #Deployment #Scala #Databricks #Microsoft Azure #DevOps #Cloud #Python #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #Big Data #SQL (Structured Query Language) #IoT (Internet of Things) #Synapse #JavaScript #Terraform #PySpark
Role description
Senior/Lead Azure Streaming Data Engineer. The ideal candidate is a senior local resource and must be based in Minneapolis, MN. US Citizen or US Green Card holder required.

The Sr./Lead Azure Streaming Data Engineer will design and build the real-time data ingestion and processing pipelines that power the Client's situational awareness and operational dashboards. This senior contractor will collaborate with the Data Architect and lead the design of streaming solutions within Azure, ensuring that data from APIs, IoT sensors, and MQTT sources is captured, processed, and delivered to end users in seconds. The role balances hands-on engineering of streaming pipelines with solution design and best-practice governance for a secure, scalable Azure environment. You will work closely with the Data Architect, GIS team, and Operations to integrate real-time data feeds (with ArcGIS geospatial context handled by the GIS specialists) and ensure that streaming data is archived and made available in the Lakehouse (Delta Raw/Enriched tables) as well as pushed to live dashboards for immediate insights.

Key Responsibilities:
• Design & Develop Real-Time Pipelines: Design and implement scalable, low-latency data pipelines to ingest real-time data from various sources (REST APIs, MQTT feeds, IoT sensors, etc.) using Azure streaming services such as Azure Event Hubs, Azure IoT Hub, Azure Stream Analytics, and Azure Functions.
• Stream Processing & Delta Lake Integration: Build streaming or micro-batch processing jobs (using Stream Analytics or Synapse Spark Structured Streaming) that cleanse, transform, and write event data into Delta Lake tables on ADLS Gen2 in the Raw and Enriched zones. Ensure that incoming data is captured in the Raw layer and promptly processed into Enriched for use in analytics.
• Real-Time Data Delivery: Enable real-time and near-real-time analytics by integrating streaming outputs with Power BI (GCC) real-time datasets and dashboards. Ensure that critical events (e.g. facility sensor alerts, airline flight alerts, passenger processing kiosk messages) are pushed to Power BI or other subscribers with minimal latency for situational awareness. Coordinate with the GIS team to incorporate location data (ArcGIS) into streaming insights when applicable (the GIS team will handle ArcGIS specifics, but this role will ensure data feeds include the necessary spatial references).
• Azure Best Practices & Security: Apply Azure best practices in all solutions, including secure networking (VNet integration, private endpoints for Event Hubs/Storage), authentication/authorization (managed identities, RBAC), encryption, and scalable design (throughput units, partitioning strategies, etc.). Ensure compliance with any relevant security or regulatory standards in this public-sector environment while optimizing for performance.
• Collaboration & Leadership: Work closely with the Client's Data Architect to review designs and align with the overall lakehouse framework. Provide technical leadership on streaming to more junior engineers if needed, and collaborate with stakeholders in operations and IT to understand real-time data needs. Participate in an agile-lite development process (task refinement, sprint planning, demos) and document your solutions for knowledge sharing.

Anticipated Project Start Date: 9/29/2025
Anticipated End Date: 6/30/2026

Required Qualifications/Skills:
• Strong Experience in Streaming Data Engineering: 7+ years of data engineering experience, with at least 2-3 years focused on real-time/streaming data pipeline development. Proven expertise in Azure streaming technologies (Azure Event Hubs, IoT Hub, Azure Stream Analytics) and related patterns (pub/sub, event processing).
• Azure Cloud & Synapse Proficiency: Hands-on experience with Azure Synapse Analytics (or Azure Spark environments) to develop data pipelines. Ability to create and manage Synapse pipelines, triggers, and Synapse Spark notebooks for streaming or batch workflows.
Familiarity with writing to and reading from Delta Lake storage. (Note: Databricks is not used in this environment, so experience with Synapse-native Spark is required.)
• Programming & Data Processing: Proficiency in Python (PySpark) and/or Scala for Spark, plus experience authoring Azure Functions (in Python, C#, or JavaScript) to handle streaming transformations or invoke APIs. Solid understanding of SQL for creating views or analyzing streaming results.
• Streaming Solution Design & Best Practices: Experience designing end-to-end streaming data solutions in Azure. Knowledge of networking, security, and scalability best practices (e.g., Event Hub partitioning, error handling and retry logic in Functions, scaling Stream Analytics jobs). Ability to optimize pipelines for high throughput and low latency.
• Data Lake and BI Integration: Experience writing data to data lakes in a structured format (Delta/Parquet) and integrating with BI tools. Understanding of how to output streaming data to Power BI (e.g., through Stream Analytics outputs or Power BI REST endpoints) or similar real-time visualization platforms.
• Communication & Leadership: Excellent communication skills to work with cross-functional teams. Ability to translate business requirements for real-time data into technical solutions. Experience in a lead or architect capacity on data projects is required, as this is a senior role needing self-direction and mentorship abilities.

Preferred Qualifications:
• Certifications: Microsoft Azure certifications such as Azure Data Engineer and/or Azure Solutions Architect.
• IoT/Streaming Ecosystem: Familiarity with other streaming and IoT technologies (e.g., Kafka, MQTT brokers, Azure Event Grid, Azure Data Explorer).
• Big Data & DevOps: Experience with big data file formats and tools (Apache Parquet/Delta, Kafka, Spark) and implementing CI/CD for data pipelines (Azure DevOps Pipelines, YAML deployments). Knowledge of infrastructure-as-code (ARM/Bicep or Terraform) for deploying data services is a plus.
• Industry Experience: Prior experience in secure or regulated environments (government agencies, utilities, healthcare, etc.) dealing with sensitive data. Understanding of compliance requirements and how to design data solutions within those constraints.
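To give candidates a concrete sense of the Raw-to-Enriched responsibility described above, the per-event work amounts to a parse-validate-shape step before data lands in the Enriched zone. A minimal, framework-free Python sketch follows; in the actual role this logic would run inside a Synapse Spark Structured Streaming job, and every field name here is invented for illustration, not taken from the Client's schema:

```python
import json
from datetime import datetime, timezone
from typing import Optional


def enrich_event(raw_body: str) -> Optional[dict]:
    """Parse one raw event (a JSON string from Event Hubs / MQTT) and
    shape it for the Enriched zone. Returns None for events that should
    stay Raw-only (malformed or missing required fields)."""
    try:
        event = json.loads(raw_body)
    except json.JSONDecodeError:
        return None  # malformed payloads remain in Raw for inspection

    # Require the minimal fields downstream dashboards depend on
    # (hypothetical field names for this sketch).
    if "sensor_id" not in event or "value" not in event:
        return None

    return {
        "sensor_id": str(event["sensor_id"]),
        "value": float(event["value"]),
        # Keep the source timestamp if present, else stamp ingestion time.
        "event_time": event.get("event_time")
        or datetime.now(timezone.utc).isoformat(),
        "source": event.get("source", "unknown"),
    }
```

The same cleanse/transform shape maps directly onto a Spark `from_json` plus filter in a streaming notebook; the point of the sketch is only the Raw-vs-Enriched contract, not the execution engine.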
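The Real-Time Data Delivery responsibility mentions pushing events to Power BI through REST endpoints. A hedged sketch of the row-batching side of that, assuming a Power BI push ("streaming") dataset whose PostRows endpoint accepts a `{"rows": [...]}` JSON body; the endpoint URL, dataset ID, table name, and token below are placeholders, and the exact request shape should be verified against the dataset's API details:

```python
import json
from typing import List


def build_push_payload(rows: List[dict]) -> str:
    """Serialize a batch of enriched rows into the JSON body used when
    POSTing rows to a Power BI push dataset. Keeping this as a pure
    function makes the batching logic testable without network access."""
    return json.dumps({"rows": rows})


# Sending the batch would look roughly like this (placeholders, not run here):
#
#   import urllib.request
#   req = urllib.request.Request(
#       "https://api.powerbi.com/v1.0/myorg/datasets/<dataset-id>"
#       "/tables/<table-name>/rows",
#       data=build_push_payload(batch).encode("utf-8"),
#       headers={"Authorization": "Bearer <token>",
#                "Content-Type": "application/json"},
#       method="POST",
#   )
#   urllib.request.urlopen(req)
```

In practice a Stream Analytics Power BI output or an Azure Function would own this call; the sketch only shows the payload contract a candidate would be working against.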