My3Tech

DataOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DataOps Engineer on a 12+ month remote contract based in Montpelier, VT, offering competitive pay. Key skills include Azure Data Engineering, Power BI, and Data Governance. Certifications such as Microsoft Certified: Azure Data Engineer Associate are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
May 6, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Microsoft Power BI #Monitoring #Data Quality #AI (Artificial Intelligence) #Alation #BI (Business Intelligence) #Anomaly Detection #Compliance #Azure #Scala #Databricks #Data Dictionary #DAX #Logging #Classification #Data Governance #Semantic Models #Data Engineering #Azure DevOps #Documentation #Base #Visualization #DevOps #Data Management #Scrum #Observability #Data Analysis #Security #Lean #DataOps #Metadata
Role description
Job Title: DataOps Engineer
Job Location: Montpelier, VT (Remote)
Duration: 12+ month contract with possibility of extension
Job Description:
The DataOps Engineer / Data Engineer will support the client’s DataOps, data quality, analytics, and governance modernization efforts. The resource will provide technical, analytical, facilitative, and governance support to help operationalize sustainable, staff-owned DataOps capabilities. Responsibilities include designing and implementing downstream data quality engineering processes, establishing statistical process control and anomaly detection practices, defining metadata and lineage standards, developing governed BI semantic models and dashboards, and creating standard operating procedures, playbooks, and Azure DevOps Wikis.
The resource will lead and support hands-on co-development sessions with State staff to build internal knowledge and long-term operational ownership. The role will translate downstream data and analytics requirements into actionable ingestion pipeline specifications for the separate modernization team, maintain operational performance metrics and observability dashboards, and support governance workflows through Azure DevOps-based iterative delivery cycles.
The assigned personnel must demonstrate strong technical skills, reliability, professionalism, and the ability to work collaboratively with State teams. The resource will be expected to complete assigned work in alignment with project objectives, State-approved standards, and direction from the State Director of Health Systems Data & Analytics. All work must be performed on time, within scope, within budget, and in accordance with State-defined quality standards. The resource must maintain professional conduct and may be subject to replacement if performance, conduct, or collaboration expectations are not met.
Preferred Qualifications:
• Microsoft Certified: Azure Data Engineer Associate
• Microsoft Certified: Power BI Data Analyst Associate (PL-300)
• Databricks Data Engineer Associate (or Professional)
• Azure AI Engineer Associate
• DAMA CDMP (Certified Data Management Professional)
• DCAM (EDM Council Data Management Certification)
• IIBA CCBA or CBAP
• PMI-PBA
• Lean Six Sigma
• Prosci Change Practitioner (for governance and adoption work)
• SAFe Practitioner (SP) or Scrum Master (PSM I / CSM)
• Azure Fundamentals (AZ-900)
• Security+ or ISC² CC (for data governance and disclosure-related work)
State Roles and Responsibilities:
State Personnel: The Director of Data and Analytics will:
• Provide an introductory process
• Assign work
• Set the schedule
• Provide high-level oversight
• Provide performance feedback
• Establish a deliverable review process
• Approve deliverables submitted for work performed
• Establish the invoice process
• Approve invoices submitted for work performed
• Provide necessary resources
• Resolve issues
Key Deliverables:
The selected resource will be responsible for supporting, developing, documenting, and transferring ownership of the following deliverables:
• Data Quality Rule Library, including governed and version-controlled downstream data quality rules for completeness, validity, conformity, thresholds, exception logic, steward ownership, business term links, and lineage references.
• Data Quality Validation Framework, including validation logic, runbooks, troubleshooting guidance, pipeline output requirements, manifests, record counts, and schema metadata expectations.
• Data Quality Incident Logging and Triage SOP, including severity classification, root-cause analysis templates, evidence capture, routing procedures, and ingestion-related issue tracking.
• Quarterly Quality Performance Summary, including issue trends, quality heat maps, data quality insights, improvement recommendations, and updates to ingestion pipeline requirements.
• Annual Quality Comparison, including comparison of analysis-ready data against external data sources and validation metrics for key data assets.
• Statistical Process Control Monitoring Framework, including control charts, thresholds, steward notes, metadata requirements, and monitoring standards for volumes, timeliness, and distributions.
• AI-Assisted Anomaly Detection Framework, including drift and outlier detection models, model cards, metadata/logging requirements, retraining schedules, and quarterly threshold tuning.
• SPC and Anomaly Event Documentation, including root-cause analysis entries, communication templates, lineage impact, BI impact, and ingestion/vendor action notes.
• Threshold and Alerting Standards, including alert tiers, service-level expectations, escalation paths, reliability KPIs, and scenario validation.
• Data Glossary, Catalog, and Lineage Documentation, including glossary terms, lineage records, steward roles, metadata standards, and upstream lineage requirements.
• Disclosure and Suppression Rules, including privacy requirements, BI suppression logic, test cases, exceptions workflow, and dashboard validation requirements.
• Governance Operating Procedures and Definition of Done, including SOPs for lineage updates, row-level security checks, disclosure checks, glossary maintenance, governance checklists, and audit readiness.
• Data Contract Requirements for Ingestion Vendor, including schema expectations, metadata fields, QA thresholds, manifests, hash totals, and schema versioning requirements.
• Governed Semantic Models for cost, quality, and access analytics, including DAX logic, measure definitions, row-level security, disclosure compliance, steward ownership, and lineage.
• Standardized Dashboards, including internal dashboards with consistent design standards, data quality overlays, SPC indicators, KPI documentation, data sources, and suppression behavior.
• BI Refresh and Validation Playbooks, including pre-refresh and post-refresh checks, validation logic, evidence capture, troubleshooting steps, and reliability monitoring.
• Data Dictionary and Visualization Standards, including metric definitions, data ownership, upstream lineage, accessibility rules, and visual design standards.
• Reliability and SLA Dashboards, including reporting for reliability percentage, mean time to resolution, incidents, governance throughput, BI adoption, and operational performance.
• Operational Metrics Package, including DataOps KPI definitions, formulas, thresholds, reporting cadence, and steward commentary.
• End-to-End Lineage Visualization, including diagrams and documentation showing data flow from ingestion through curated data, data quality/SPC processes, and BI reporting.
• Quarterly DataOps Trend Report, including quality trends, SPC findings, reliability trends, governance compliance, BI adoption, improvement actions, and steward approvals.
• Hands-on Role-Based Training Materials, including labs, recordings, facilitation guides, ADO workflow guidance, and staff enablement materials.
• Knowledge Base of SOPs and Playbooks, including version-controlled runbooks, audit guides, ownership assignments, review schedules, and maintenance procedures.
• Cross-Team Collaboration Procedures, including RACI documentation, communication templates, handoff procedures, routing workflows, and coordination with ingestion modernization teams.
• Change Management and Sustainment Materials, including readiness assessments, adoption dashboards, communications, sustainment roadmap, and support for long-term staff independence.
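For candidates unfamiliar with the statistical process control monitoring named in the deliverables above, such frameworks commonly rest on control-chart limits like the 3-sigma rule. A minimal illustrative sketch in Python (the function name, baseline-window approach, and sample data are hypothetical, not taken from the client's actual framework):

```python
from statistics import mean, stdev

def spc_outliers(values, baseline_n=5, sigma=3.0):
    """Flag points that fall outside control limits derived from a baseline window.

    Control limits are mean +/- sigma * stddev of the first `baseline_n` points,
    applied to every subsequent point (a simple 3-sigma control-chart rule).
    """
    baseline = values[:baseline_n]
    center, spread = mean(baseline), stdev(baseline)
    upper = center + sigma * spread
    lower = center - sigma * spread
    # Return (index, value) for each post-baseline point outside the limits.
    return [(i, v) for i, v in enumerate(values[baseline_n:], start=baseline_n)
            if v < lower or v > upper]

# Example: daily ingestion record counts with one anomalous drop in volume.
daily_counts = [1000, 1010, 995, 1005, 998, 400, 1002, 1001]
print(spc_outliers(daily_counts))  # flags the day with only 400 records
```

In production, limits are typically recomputed on a rolling window and paired with the alert tiers and escalation paths described in the Threshold and Alerting Standards deliverable.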
The resource must ensure that deliverables are created as living documentation, preferably through Azure DevOps Wikis and/or Azure DevOps work items, and developed through hands-on co-development with State staff to promote long-term ownership and operational sustainability.