Comtech Global, Inc.

DataOps Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a DataOps Engineer, 100% remote, with a contract length of "unknown." The pay rate is "unknown." Key skills include Azure DevOps, Databricks, AWS Data Lake, and Power BI. Experience in healthcare data systems is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 2, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Morrisville, VT
-
🧠 - Skills detailed
#Metadata #Security #DataCatalog #DataEngineering #DataIntegration #Base #AnomalyDetection #Lean #Monitoring #AzureDatabricks #DataPipeline #MicrosoftPowerBI #Documentation #DataManagement #Compliance #Scrum #DataLineage #Automation #DataGovernance #DataQuality #AzureDevOps #BI (Business Intelligence) #AWS (Amazon Web Services) #DataDictionary #Azure #SemanticModels #DataAccuracy #Databricks #DevOps #DataAnalysis #DataOps #PowerAutomate #DataLake #DataIngestion #ETL (Extract, Transform, Load) #Agile #DataLifecycle
Role description
Direct client - DataOps Engineer, 100% REMOTE

The DataOps Engineer will support our direct client's Rural Health Transformation Project by building and operationalizing a modern data infrastructure. The role focuses on improving data quality, governance, analytics, and pipeline reliability across enterprise health data systems. The engineer will work closely with data teams, governance stakeholders, and vendors to design and implement DataOps processes, ensure data accuracy, and enable business intelligence reporting through modern tools and Agile methodologies.

Interview Process
• Oral presentations / interviews are required
• Candidates must be available to participate in client evaluation sessions

Key Responsibilities
• Design and implement DataOps processes and workflows
• Develop data quality rules, validation frameworks, and monitoring systems
• Build and maintain data pipelines and ingestion specifications
• Implement statistical process control (SPC) and anomaly detection models
• Develop and manage metadata, lineage, and data governance frameworks
• Create and maintain BI dashboards and semantic data models
• Monitor data reliability, performance, and SLAs
• Use Azure DevOps for workflow management and tracking
• Collaborate with teams to translate business needs into technical solutions
• Develop documentation, SOPs, and knowledge bases (wikis)
• Conduct training and enablement sessions for staff
• Ensure compliance with data governance and security standards

Key Skills

Technical Skills
• Strong experience in DataOps / data engineering
• Expertise in Azure DevOps (Boards, Pipelines, Repos)
• Experience with Azure Databricks and AWS Data Lake
• Proficiency in Power BI (dashboards, semantic models)
• Knowledge of ETL/ELT pipelines and data ingestion frameworks
• Experience with data quality, validation, and monitoring tools
• Understanding of APIs and data integrations
• Familiarity with automation tools (Power Automate, pipelines)

Data & Analytics Skills
• Data quality management and validation frameworks
• Statistical process control (SPC) and anomaly detection
• Data governance, metadata management, and lineage tracking
• BI reporting and analytics modeling
• Data cataloging and data dictionary management

Functional Skills
• Strong understanding of the data lifecycle and data governance
• Experience in Agile/Scrum environments
• Ability to translate business requirements into data solutions
• Knowledge of healthcare data systems (preferred)

Certifications (Preferred)
• Azure Data Engineer Associate
• Power BI Data Analyst (PL-300)
• Databricks certification
• Data management (DAMA CDMP / DCAM)
• Agile/Scrum certifications (CSM, SAFe)
• Lean Six Sigma
• Security certifications (Security+, ISC²)

Soft Skills
• Strong communication and collaboration
• Analytical and problem-solving skills
• Attention to detail
• Ability to lead cross-team coordination
• Strong documentation and knowledge-sharing ability

Simple Summary (for quick submission)
• Role: DataOps Engineer
• Focus: Data pipelines, governance, analytics, and quality
• Tech Stack: Azure DevOps, Databricks, AWS Data Lake, Power BI
• Key Strengths: Data quality, automation, analytics, and Agile delivery
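The responsibilities above include implementing statistical process control (SPC) for data quality monitoring. As a minimal sketch of that technique (the function names and sample row counts below are hypothetical illustrations, not part of this posting), a Shewhart-style control-limit check in Python could look like:

```python
from statistics import mean, stdev

def control_limits(baseline, sigma=3.0):
    """Shewhart-style control limits: mean +/- sigma * sample std dev."""
    mu = mean(baseline)
    sd = stdev(baseline)
    return mu - sigma * sd, mu + sigma * sd

def flag_anomalies(values, baseline, sigma=3.0):
    """Return True for each value that falls outside the baseline limits."""
    lo, hi = control_limits(baseline, sigma)
    return [not (lo <= v <= hi) for v in values]

# Hypothetical daily row counts from an ingestion pipeline.
baseline = [1020, 998, 1005, 1011, 987, 1003, 995, 1009]
new_days = [1001, 2500, 990]  # 2500 is an obvious spike
print(flag_anomalies(new_days, baseline))  # → [False, True, False]
```

In a real pipeline such a check would typically run against per-run metrics (row counts, null rates, load latency) and raise alerts when a value breaches the control limits.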