

Robert Half
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a contract-to-hire Data Engineer position focused on building data pipelines in a Microsoft Fabric environment. It requires 5+ years of experience, strong SQL skills, and expertise with ERP data integration. The posted rate is $480/day, and the role is based in the Greater Houston area.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
May 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Greater Houston
-
🧠 - Skills detailed
#Data Pipeline #ADF (Azure Data Factory) #Scala #Oracle #ETL (Extract, Transform, Load) #DAX #Code Reviews #Azure Data Factory #DevOps #Azure DevOps #GitHub #Databricks #Monitoring #Consulting #SAP #Dataflow #BI (Business Intelligence) #Synapse #Semantic Models #Azure #SQL (Structured Query Language) #Data Quality #Data Lineage #Datasets #Microsoft Power BI #Data Engineering
Role description
Position Overview
We are seeking an experienced Data Engineer to join our Enterprise Business Intelligence team. This is a hands-on role where you will be immediately contributing to active projects in a Microsoft Fabric environment, building and maintaining data pipelines that integrate data from 16 ERP systems and other enterprise sources. You will partner closely with our existing data engineer, BI analysts, and business stakeholders to deliver trusted, scalable data solutions that drive business decisions. This position is contract-to-hire, with the expectation that the right candidate can hit the ground running on day one.
Key Responsibilities
• Design, build, and maintain scalable data pipelines and ELT/ETL processes within Microsoft Fabric (Lakehouse, Warehouse, Dataflows Gen2, Pipelines, and Notebooks).
• Integrate and harmonize data from 16 ERP systems and other source systems into curated, analytics-ready datasets.
• Develop and optimize data models (medallion architecture: bronze, silver, gold) to support self-service analytics and Power BI reporting.
• Write efficient, well-tested SQL for data transformation and orchestration.
• Implement data quality checks, monitoring, alerting, and governance practices to ensure reliability and trust in the data platform.
• Manage source control, branching, pull requests, and CI/CD pipelines using Azure DevOps and GitHub.
• Collaborate with the BI analyst team to translate business requirements into performant data structures and semantic models.
• Document data lineage, definitions, and processes to support transparency and knowledge sharing across the team.
• Participate in code reviews, sprint planning, and architecture discussions; contribute to continuous improvement of engineering standards.
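To make the data-quality and medallion-layer responsibilities above concrete, here is a minimal sketch of the kind of quality gate a pipeline might apply when promoting bronze records to silver. All table and field names (order_id, erp_source, amount) are hypothetical; in practice this logic would run inside a Fabric notebook or Dataflow rather than plain Python.

```python
# Sketch of a data-quality gate between medallion layers: rows with
# nulls in required fields or duplicate business keys are quarantined
# instead of being promoted to the silver layer.

def quality_check(rows, required_fields, key_field):
    """Split rows into (passed, rejected) by null and duplicate-key checks."""
    seen_keys = set()
    passed, rejected = [], []
    for row in rows:
        has_nulls = any(row.get(f) is None for f in required_fields)
        is_dup = row.get(key_field) in seen_keys
        if has_nulls or is_dup:
            rejected.append(row)
        else:
            seen_keys.add(row[key_field])
            passed.append(row)
    return passed, rejected

bronze = [
    {"order_id": 1, "erp_source": "SAP", "amount": 120.0},
    {"order_id": 1, "erp_source": "SAP", "amount": 120.0},  # duplicate key
    {"order_id": 2, "erp_source": "QAD", "amount": None},   # missing amount
    {"order_id": 3, "erp_source": "IFS", "amount": 75.5},
]
silver, quarantined = quality_check(bronze, ["erp_source", "amount"], "order_id")
print(len(silver), len(quarantined))  # 2 2
```

Quarantined rows would typically feed the monitoring and alerting mentioned above rather than being silently dropped.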
Required Qualifications
• 5+ years of professional data engineering experience building production-grade pipelines and data platforms.
• Strong hands-on experience with Microsoft Fabric (or Azure Synapse / Azure Data Factory / Databricks with a clear willingness and ability to transfer skills to Fabric immediately).
• Expert-level SQL skills, including performance tuning, window functions, and complex transformations.
• Solid understanding of dimensional modeling, medallion architecture, and modern data warehousing concepts.
• Experience integrating data from ERP systems (e.g., SAP, Dynamics, Oracle, IFS, QAD, or similar).
• Working knowledge of Azure DevOps and/or GitHub for source control, pull requests, and CI/CD.
• Strong troubleshooting skills and the ability to operate independently in a fast-moving environment.
• Excellent communication skills with the ability to explain technical concepts to business partners.
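As an illustration of the window-function SQL the qualifications above call for, the sketch below picks the latest record per order key with ROW_NUMBER(), a common dedup pattern when multiple ERP feeds land overlapping data. It runs against an in-memory SQLite database purely for demonstration; the schema and values are hypothetical.

```python
# Window-function dedup: keep only the most recently loaded row per
# order_id, using ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE staged_orders (order_id INT, erp_source TEXT, loaded_at TEXT);
INSERT INTO staged_orders VALUES
  (1, 'SAP', '2024-01-01'),
  (1, 'SAP', '2024-02-01'),
  (2, 'QAD', '2024-01-15');
""")
rows = con.execute("""
SELECT order_id, loaded_at FROM (
  SELECT order_id, loaded_at,
         ROW_NUMBER() OVER (PARTITION BY order_id
                            ORDER BY loaded_at DESC) AS rn
  FROM staged_orders
) WHERE rn = 1
ORDER BY order_id
""").fetchall()
print(rows)  # [(1, '2024-02-01'), (2, '2024-01-15')]
```

The same query shape carries over directly to Fabric Warehouse or Lakehouse SQL endpoints.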
Preferred Qualifications
• Experience working in environments with multiple, heterogeneous ERP systems.
• Familiarity with Power BI semantic models, DAX, and downstream analytics consumption patterns.
• Prior contract-to-hire or consulting experience with rapid onboarding into active project work.
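For the multi-ERP harmonization work described throughout this posting, a common pattern is a per-system field map that projects each source's native column names onto one canonical shape. The mappings below are illustrative placeholders, not the actual schemas involved.

```python
# Sketch of harmonizing records from heterogeneous ERP systems into a
# single canonical record shape. Field mappings are hypothetical.

FIELD_MAPS = {
    "SAP": {"VBELN": "order_id", "NETWR": "amount"},
    "QAD": {"so_nbr": "order_id", "so_amt": "amount"},
}

def harmonize(record, system):
    """Rename a source record's fields to canonical names, tagging its origin."""
    mapping = FIELD_MAPS[system]
    out = {canon: record[src] for src, canon in mapping.items()}
    out["erp_source"] = system
    return out

print(harmonize({"VBELN": 42, "NETWR": 99.0}, "SAP"))
# {'order_id': 42, 'amount': 99.0, 'erp_source': 'SAP'}
```

Keeping the maps as data (rather than per-system code paths) makes adding the next ERP feed a configuration change instead of a pipeline rewrite.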






