

Catapult Solutions Group
Azure DevOps Engineer (Mid-Level)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Mid-Level Azure DevOps Engineer on a 6-month contract in McKinney, TX (hybrid). Requires 3-5 years of Azure DevOps experience, CI/CD pipeline expertise, and familiarity with Microsoft Fabric or Azure data tools. Pay rate: $40–$60/hr.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
April 22, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
McKinney, TX
-
🧠 - Skills detailed
#Synapse #Monitoring #ETL (Extract, Transform, Load) #Computer Science #Scala #Cloud #Data Pipeline #Azure #Data Engineering #ADF (Azure Data Factory) #Microsoft Azure #Terraform #Security #Scripting #Bash #Consulting #DevOps #Python #Azure cloud #Automation #Data Processing #Data Architecture #Data Lake #Azure Data Factory #Azure DevOps #Logging #Deployment #GIT #Infrastructure as Code (IaC)
Role description
Azure DevOps Engineer (Microsoft Fabric / Data Platforms)
Department: Information Technology / DevOps / Cloud Engineering
Location: McKinney, TX (Hybrid; must be comfortable working onsite as needed)
Role Type: Contract (6-Month Contract to Start, Strong Potential for Extension or Conversion)
• USC OR GREEN CARD HOLDER ONLY
About Our Client
Our client is a growing technology consulting firm that partners with enterprise organizations to modernize their cloud and data platforms. They specialize in delivering scalable solutions across the Microsoft ecosystem, helping businesses transition from traditional systems to cloud-based environments. With a strong focus on innovation, automation, and data-driven decision-making, they support clients across multiple industries by building reliable, secure, and high-performing cloud infrastructures.
Job Description
We are seeking a Mid-Level Azure DevOps Engineer to support cloud-based data platform initiatives, specifically within Microsoft Fabric environments (a data and analytics platform).
In this role, you will be responsible for building, automating, and maintaining deployment pipelines and cloud environments that support modern data platforms. Your work will directly impact the reliability and scalability of analytics, reporting, and data processing systems used by enterprise clients.
Day-to-day, you will collaborate with data engineers, architects, and project teams to ensure seamless deployments across development, testing, and production environments. This role is critical in enabling repeatable, automated deployment processes while also troubleshooting issues and improving overall platform performance. You will also engage with stakeholders to provide updates and clearly explain technical concepts when needed.
Duties and Responsibilities
• Design, build, and maintain CI/CD pipelines in Azure DevOps
• Support deployments within Microsoft Fabric environments (data pipelines, analytics workspaces, reporting layers)
• Automate infrastructure and environments using Terraform, Bicep, or ARM templates
• Support environment provisioning across development, test, and production environments
• Troubleshoot deployment failures, pipeline issues, and environment inconsistencies
• Collaborate with data engineers and architects to ensure platform readiness
• Implement monitoring, logging, and alerting across cloud environments
• Support data modernization efforts (on-prem to Azure / Fabric)
• Maintain source control, release management, and deployment standards
• Communicate deployment updates and technical concepts to internal teams and stakeholders
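As a rough illustration of the CI/CD and Infrastructure as Code responsibilities above, a minimal Azure DevOps pipeline definition might look like the sketch below. This is not taken from the client's environment: the stage names, agent image, and the `infra` working directory are illustrative assumptions.

```yaml
# Illustrative azure-pipelines.yml sketch. Stage names, the agent image,
# and the "infra" directory are assumptions, not details from this posting.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

stages:
  - stage: Validate
    jobs:
      - job: TerraformValidate
        steps:
          - script: |
              terraform init -backend=false
              terraform validate
            workingDirectory: infra   # hypothetical IaC directory
            displayName: Validate Terraform configuration

  - stage: Deploy
    dependsOn: Validate
    jobs:
      - job: TerraformApply
        steps:
          - script: |
              terraform init
              terraform plan -out=tfplan
              terraform apply -auto-approve tfplan
            workingDirectory: infra
            displayName: Plan and apply infrastructure
```

In practice, a pipeline like this would also reference a service connection for Azure authentication and separate variable groups per environment (dev, test, production), mirroring the environment-provisioning duties listed above.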
Required Experience/Skills
• 3–5 years of experience in Azure DevOps, Cloud Engineering, or DevOps roles
• Strong experience with Azure DevOps and CI/CD pipelines
• Hands-on experience with Microsoft Azure cloud services
• Experience working with data platforms or analytics environments
• Exposure to Microsoft Fabric OR strong experience with Azure data tools (Data Factory, Synapse) in modern data environments
• Experience with Infrastructure as Code (Terraform, Bicep, ARM)
• Scripting experience (PowerShell, Bash, or Python)
• Understanding of cloud networking, security, and access control
• Experience with Git and source control best practices
• Strong troubleshooting and problem-solving skills
• Ability to communicate technical concepts clearly to both technical and non-technical audiences
Nice-to-Haves
• Direct experience with Microsoft Fabric (Lakehouse, analytics workspaces, pipelines, reporting layers)
• Understanding of modern data architecture (ETL/ELT, data lakes, analytics pipelines)
• Experience in consulting or client-facing environments
• Familiarity with Azure monitoring tools (Log Analytics, Application Insights)
• Azure certifications (Administrator, DevOps Engineer, Data Engineer, etc.)
Education
Bachelor’s degree in Computer Science, Information Technology, Engineering, or related field preferred
(Equivalent hands-on experience will also be considered)
Pay & Benefits Summary
• Pay Rate: $40–$60/hr (W2)
Call-to-Action
Apply now to be part of a high-impact team driving cloud and data innovation for enterprise clients!
Azure | DevOps | CI/CD | Cloud Engineering | Azure Data Factory | Terraform | Infrastructure as Code | Data Platforms