

DataOps Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a DataOps Engineer, hybrid in Houston, TX or Denver, CO, with a contract length of over 6 months and a pay rate of $110,000 - $125,000/year. Key skills include Python, SQL, cloud platforms, and data governance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
568.18
-
🗓️ - Date discovered
September 4, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Colorado, United States
-
🧠 - Skills detailed
#Data Quality #Docker #Version Control #Continuous Deployment #Data Architecture #MDM (Master Data Management) #Python #SQL (Structured Query Language) #Apache Airflow #AWS (Amazon Web Services) #Automation #Compliance #Data Management #DataOps #Computer Science #GIT #ML (Machine Learning) #Data Warehouse #Quality Assurance #Scala #DevOps #Agile #Monitoring #Boomi #GCP (Google Cloud Platform) #Security #Azure #ML Ops (Machine Learning Operations) #Programming #Data Integrity #Databases #SSIS (SQL Server Integration Services) #Data Engineering #Data Processing #Deployment #Data Integration #Infrastructure as Code (IaC) #Data Privacy #Airflow #Cloud #Documentation #ETL (Extract, Transform, Load) #Logging #Project Management #Luigi #Process Automation #Data Governance #C# #Kubernetes #Data Pipeline #Terraform #Data Access #Java
Role description
Title: DataOps Engineer
Segment: Voyager - Corporate
Location: Houston, TX or Denver, CO
Job Type: Full Time - Hybrid
Relocation Eligible: No
Company Description: Voyager is an innovative defense, national security and space technology company committed to advancing and delivering transformative, mission-critical solutions. We tackle the most complex challenges to unlock new frontiers for human progress, fortify national security, and protect critical assets to lead in the race for technological and operational superiority from ground to space.
Position Description
A DataOps Engineer stands at the intersection of data engineering, operations, and process automation, playing a vital role in enabling an organization’s data-driven decision-making. This role is designed for those who thrive on both the technical and process-oriented challenges of modern data ecosystems, ensuring seamless data flows, robust infrastructure, and reliable analytics pipelines. The DataOps Engineer enables teams to collaborate efficiently, delivers continuous improvements to data operations, and champions innovation in management and utilization of data resources.
Key Responsibilities
• Design and Implement Data Pipelines:
• Architect, develop, and maintain automated data pipelines using modern technologies (such as Apache Airflow, Luigi, SSIS, Boomi or similar orchestration tools). Ensure data is ingested, transformed, and delivered to downstream systems efficiently and reliably (an illustrative Airflow sketch follows this list).
• Infrastructure Management:
• Build, monitor, and optimize cloud-based and on-premises data infrastructure. Collaborate with IT and DevOps teams to provision scalable resources, implement monitoring solutions, and automate deployment processes.
• Data Quality Assurance:
• Develop and enforce data validation, profiling, and monitoring tools to ensure data integrity, accuracy, and completeness across all stages of the pipeline.
• Continuous Integration / Continuous Deployment (CI/CD):
• Integrate CI/CD best practices into data workflows, automating testing, deployment, and rollback mechanisms for data models, code, and infrastructure (see the data-test sketch after this list).
• Collaboration & Communication:
• Work closely with data engineers, analysts, scientists, and business stakeholders to gather requirements, troubleshoot issues, and deliver data solutions tailored to organizational goals.
• Monitoring and Incident Response:
• Implement comprehensive monitoring, logging, and alerting systems for data operations. Respond to incidents, perform root-cause analysis, and take corrective actions to minimize downtime and data loss.
• Governance and Security:
• Enforce data governance policies, manage data access controls, and ensure compliance with relevant data privacy and security frameworks such as CMMC, NIST 800-171, or others as required.
• Documentation & Knowledge Sharing:
• Maintain thorough technical documentation of data architectures, operational procedures, and troubleshooting guides. Promote knowledge sharing via wikis, workshops, and training sessions.
• Performance Optimization:
• Analyze pipeline and infrastructure performance, identify bottlenecks, and implement optimizations to reduce latency, improve scalability, and enhance user experience.
• Innovation and Automation:
• Continuously evaluate emerging tools, frameworks, and practices in the realms of data engineering and operations. Drive adoption of automation, machine learning operations, and self-healing systems.
• Application Management:
• Assist in maintaining data associated with all applications utilized within the organization.
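For illustration only, here is a minimal sketch of the kind of orchestrated pipeline described in the responsibilities above, assuming Apache Airflow 2.x and Python; the DAG name, task names, and sample records are hypothetical placeholders and not part of this posting.

```python
# Hedged sketch: a tiny Airflow DAG showing ingest -> validate -> load,
# with a data-quality gate that fails the run if required fields are missing.
# All names (orders_pipeline, extract_orders, ...) are hypothetical examples.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: a real task would pull from an API, file drop, or source database.
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]


def validate_orders(ti, **context):
    # Data-quality gate: raise (and fail the run) if any row is missing an amount.
    rows = ti.xcom_pull(task_ids="extract_orders")
    bad = [r for r in rows if r.get("amount") is None]
    if bad:
        raise ValueError(f"{len(bad)} rows failed validation: {bad}")


def load_orders(ti, **context):
    # Placeholder: a real task would write validated rows to the warehouse.
    rows = ti.xcom_pull(task_ids="extract_orders")
    print(f"Loading {len(rows)} validated rows")


with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+ argument; older 2.x versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    validate = PythonOperator(task_id="validate_orders", python_callable=validate_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> validate >> load
```

The same pattern carries over to Luigi, SSIS, or Boomi: each stage is an explicit task with retries, and validation sits inside the pipeline rather than as an afterthought.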
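For the CI/CD responsibility above, a hedged sketch of an automated data test that a pipeline could run as a gate before deployment; it uses pytest with an in-memory SQLite table so the example stays self-contained, and the orders table and column names are hypothetical.

```python
# Hedged sketch: data-quality tests a CI job could run before promoting a
# data model. The schema and table name are hypothetical; in practice the
# fixture would connect to a test copy of the warehouse instead of SQLite.
import sqlite3

import pytest


@pytest.fixture()
def conn():
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 120.0), (2, 75.5)])
    yield con
    con.close()


def test_orders_have_no_null_amounts(conn):
    nulls = conn.execute("SELECT COUNT(*) FROM orders WHERE amount IS NULL").fetchone()[0]
    assert nulls == 0


def test_order_ids_are_unique(conn):
    total, distinct = conn.execute(
        "SELECT COUNT(order_id), COUNT(DISTINCT order_id) FROM orders"
    ).fetchone()
    assert total == distinct
```

Wired into a CI/CD pipeline, a failing test blocks the deployment and triggers the rollback path described above.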
Required Skills and Qualifications:
• Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field; or a minimum of ten (10) years’ experience in business IT data management.
• Strong programming skills in Python, Java, C# or Scala; proficiency in SQL and relational databases.
• Experience with orchestration tools (e.g., Apache Airflow, Prefect, Luigi).
• Knowledge of cloud platforms (AWS, Azure, GCP) and data infrastructure components (data warehouses, lakes, streaming platforms).
• Familiarity with DevOps practices, including containerization (Docker, Kubernetes), CI/CD pipelines, and infrastructure as code (Terraform, CloudFormation).
• Understanding of data quality principles, governance, and security frameworks.
• Strong problem-solving abilities, attention to detail, and capacity to troubleshoot complex data issues.
• Ability to manage several projects or tasks simultaneously.
• Excellent communication skills and ability to work collaboratively in cross-functional teams.
• Knowledge of agile methodologies, project management tools, and version control systems (Git).
Preferred Qualifications:
• Advanced degree or certifications in data engineering, cloud architecture, or DevOps.
• Experience (5 years minimum) with relational and tabular data management.
• Master Data Management experience (5 years minimum).
• Experience with real-time data processing tools.
• Experience with systems data integrations.
• Background in machine learning operations (MLOps) and experience deploying models to production.
• Experience implementing data governance frameworks in regulated industries.
Personal Attributes:
• Curiosity and a passion for continuous learning and professional growth.
• Resourcefulness, adaptability, and resilience in the face of evolving challenges.
• Dedication to quality, excellence, and proactive process improvement.
• Commitment to ethical business practices.
• Ability to communicate complex concepts clearly to technical and non-technical stakeholders.
Salary Range: $110,000 - $125,000/year
Please click “Apply” to submit your application.
Voyager offers a comprehensive, total compensation package, which includes competitive salary, a discretionary annual bonus plan, paid time off (PTO), a comprehensive health benefit package, retirement savings, wellness program, and various other benefits. When you join our team, you’re not just an employee; you become part of a dynamic community dedicated to innovation and excellence.
To conform to U.S. Government space technology export regulations, including the International Traffic in Arms Regulations (ITAR), you must be a U.S. citizen, lawful permanent resident of the U.S., protected individual as defined by 8 U.S.C. 1324b(a)(3), or eligible to obtain the required authorizations from the U.S. Department of State.
Voyager is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law.
Minority/Female/Disabled/Veteran
The statements contained in this job description are intended to describe the general content and requirements for performance of this job. It is not intended to be an exhaustive list of all job duties, responsibilities, and requirements. This job description is not an employment agreement or contract. Management has the exclusive right to alter the scope of work within the framework of this job description at any time without prior notice.