

Burtch Works
Azure Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Engineer with a contract length of "unknown" and a pay rate of "unknown." It requires 5+ years of experience in ETL, Azure services, and Microsoft Fabric, along with strong SQL and data modeling skills. Location is hybrid in Chicago.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 8, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#SQL (Structured Query Language) #ADF (Azure Data Factory) #Data Governance #Azure #Datasets #Microsoft Azure #Data Quality #dbt (data build tool) #Microsoft Power BI #Scala #Synapse #Data Warehouse #Documentation #Airflow #Data Mart #BI (Business Intelligence) #CRM (Customer Relationship Management) #Data Engineering #ETL (Extract, Transform, Load) #Data Modeling
Role description
Job Title: Data Engineer – Azure & Fabric
Location: Chicago (Hybrid) – On-site cadence flexible; occasional travel to collaborate with partner teams and portfolio companies.
About The Company
A U.S.-based private equity firm focused on partnering with lower middle-market industrial growth businesses. The firm invests in high-performing companies that can benefit from operational, technology, and strategic resources to scale and thrive. Portfolio companies span essential industrial sectors, creating opportunities to apply shared data/technology solutions that drive measurable enterprise value.
Job Summary
We are seeking a hands-on Data Engineer to design and build a modern data platform in Microsoft Azure and Fabric. This role will focus on developing ETL pipelines, integrating key data sources (ERP, CRM, HRIS, subscription data), and building a scalable Data Mart to support analytics, reporting, and decision-making across the firm and its portfolio companies. The ideal candidate thrives in an individual contributor role, enjoys working in a multi-entity environment, and is passionate about building high-quality, reliable data solutions.
Key Responsibilities
• Design, build, and maintain ETL pipelines in Azure and Microsoft Fabric using a medallion architecture (bronze/silver/gold); a minimal sketch follows this list.
• Develop and optimize a centralized Data Mart to support business reporting, analytics, and self-service dashboards in Power BI.
• Integrate structured and unstructured data from ERP, CRM, HRIS, and subscription-based systems via APIs and connectors.
• Ensure data quality, lineage, and governance practices are followed to maintain trusted datasets.
• Automate workflows and monitor pipelines for reliability, scalability, and performance.
• Collaborate with analysts, business stakeholders, and portfolio companies to translate requirements into technical solutions.
• Document data models, processes, and pipelines to support long-term maintainability.
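For illustration, here is a minimal bronze-to-silver transform of the kind this role would own, written as a PySpark sketch in the style of a Fabric notebook. The table names (bronze_crm_accounts, silver_crm_accounts) and columns (account_id, updated_at) are hypothetical placeholders, not details from this posting.

# Minimal bronze -> silver sketch for a Fabric/Spark notebook.
# All table and column names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw data landed as-is from the source system.
bronze = spark.read.table("bronze_crm_accounts")

# Silver: deduplicated, typed, and conformed for downstream modeling.
silver = (
    bronze
    .dropDuplicates(["account_id"])                       # one row per business key
    .withColumn("updated_at", F.to_timestamp("updated_at"))
    .filter(F.col("account_id").isNotNull())              # basic data-quality gate
)

silver.write.mode("overwrite").saveAsTable("silver_crm_accounts")

A gold layer would then aggregate silver tables into the Data Mart facts and dimensions that Power BI reports consume.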
Requirements (Must-Haves)
• 5+ years of experience as a Data Engineer with strong expertise in ETL, Azure services, and Microsoft Fabric.
• Experience building Data Marts or Data Warehouses with medallion/lakehouse patterns.
• Strong SQL and data modeling skills with a focus on structured and semi-structured data.
• Hands-on experience integrating APIs and third-party data sources (see the ingestion sketch after this list).
• Familiarity with Power BI and enabling data for self-service analytics.
• Strong problem-solving and documentation skills with a detail-oriented mindset.
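As a hedged sketch of the API-integration requirement above, the following pulls a third-party feed and lands it unchanged in a bronze table. The endpoint, token placeholder, and table name are hypothetical; a production connector would add paging, retries, and incremental watermarks.

# Hedged sketch: land a third-party API feed in the bronze layer.
# Endpoint, token, and table name are hypothetical placeholders.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

resp = requests.get(
    "https://api.example-crm.com/v1/accounts",    # hypothetical endpoint
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=30,
)
resp.raise_for_status()
records = resp.json()  # assumes the API returns a JSON array of objects

# Land the payload unmodified (schema-on-read) in bronze.
df = spark.createDataFrame(records)
df.write.mode("append").saveAsTable("bronze_crm_accounts_api")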
Nice to Haves
• Experience harmonizing external datasets with internal sources.
• Exposure to data governance frameworks and best practices.
• Familiarity with modern tooling such as dbt, orchestration tools (Airflow, ADF, Synapse Pipelines), or GenAI-assisted approaches to data productivity.