

KPI Partners
Senior DataOps Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior DataOps Engineer, 12-month contract, 100% remote (PST hours). Key skills include Snowflake, BigQuery, Apache Airflow, and DBT. Strong experience in data security and governance is required.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: January 13, 2026
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #Data Architecture #Azure #Apache Airflow #Clustering #SQL (Structured Query Language) #Informatica #GCP (Google Cloud Platform) #Data Security #Data Ingestion #Cloud #Snowflake #AutoScaling #dbt (data build tool) #Fivetran #AWS (Amazon Web Services) #DataOps #Monitoring #Documentation #Databricks #Airflow #Data Pipeline #Data Integration #Scala #BigQuery #AI (Artificial Intelligence) #Data Quality #Data Governance #Security #Consulting #ETL (Extract, Transform, Load) #Data Engineering
Role description
Title: Senior DataOps Engineer (Snowflake & BigQuery)
Location: 100% Remote, PST hours (8 AM to 5 PM PST)
Job Type: Contract (12 months)
Key Skills: Snowflake, Fivetran, DBT, Apache Airflow
Nice to Have: Data Security, Data Governance
About KPI Partners
KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are leaders in data engineering on Azure, AWS, Google Cloud, Snowflake, and Databricks. Founded in 2006, KPI has over 400 consultants and has successfully delivered over 1,000 projects for our clients.
About the Role:
We are seeking a highly technical, hands-on Senior DataOps Engineer responsible for the end-to-end reliability, performance, security, and cost-efficiency of our modern data ecosystem. This role serves as the technical guardian of our Snowflake and BigQuery platforms, ensuring seamless data ingestion, robust orchestration, and optimized transformations at scale.
The ideal candidate is not just a user of data platforms and tools but a deep technical operator and tuner: someone who understands the underlying architectures and proactively optimizes for performance, scalability, and cloud cost efficiency.
Key Responsibilities
1. Snowflake & BigQuery Administration
• Performance Optimization:
Proactively monitor and tune query performance by identifying bottlenecks and optimizing clustering strategies, search optimization, materialized views, and warehouse configurations.
• Security & Governance:
Design, implement, and maintain complex Role-Based Access Control (RBAC) models to enforce data security, governance, and least-privilege access.
• Cost Management:
Implement resource monitors, warehouse auto-scaling policies, and consumption tracking to optimize cloud spend and maximize ROI (see the sketch after this list).
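For illustration, here is a minimal sketch of the kind of RBAC grants and spend controls these bullets imply. It assumes the snowflake-connector-python package; the database, role, warehouse, and quota values are hypothetical placeholders, not KPI Partners' actual configuration.

```python
# Minimal sketch: least-privilege RBAC plus a spend cap in Snowflake.
# Assumes snowflake-connector-python; all object names and quota values
# are hypothetical placeholders, not a real production configuration.
import os
import snowflake.connector

STATEMENTS = [
    # Least-privilege read-only role for analysts.
    "CREATE ROLE IF NOT EXISTS analyst_ro",
    "GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro",
    "GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro",
    # Future grants keep the model consistent as new tables are created.
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro",
    # Resource monitor: notify at 80% of the monthly credit quota,
    # suspend the warehouse at 100% to cap spend.
    """CREATE OR REPLACE RESOURCE MONITOR monthly_cap
       WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_cap",
    # Multi-cluster auto-scaling (a Snowflake Enterprise edition feature).
    """ALTER WAREHOUSE analytics_wh SET
       MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 3 SCALING_POLICY = 'STANDARD'""",
]

conn = snowflake.connector.connect(
    account="myorg-myaccount",               # hypothetical account identifier
    user="dataops_admin",                    # hypothetical admin user
    password=os.environ["SNOWFLAKE_PASSWORD"],  # use a secrets manager in practice
    role="ACCOUNTADMIN",                     # monitors require elevated privileges
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```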
2. Data Integration & Pipeline Operations
• Data Ingestion:
Provision, manage, and monitor high-volume data ingestion pipelines using Fivetran, Informatica, and Rite Sync, ensuring reliability and data freshness.
• Orchestration:
Maintain, monitor, and troubleshoot complex workflows and DAGs in Apache Airflow to ensure timely and dependable data delivery.
• Transformations:
Support and optimize DBT models, ensuring efficient transformations, data quality checks, testing, and documentation (a simplified DAG sketch follows this list).
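For illustration, here is a deliberately simplified sketch of the orchestration pattern above: an Airflow DAG that runs and tests dbt models on a schedule. It assumes Airflow 2.x with the BashOperator; the project path, target, and schedule are hypothetical, and a real pipeline would first sense that Fivetran or Informatica loads have landed.

```python
# Minimal sketch of an Airflow DAG that orchestrates dbt transformations.
# Assumes Airflow 2.x; the project path, schedule, and target are
# hypothetical placeholders for illustration only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "dataops",
    "retries": 2,                          # retry transient failures
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_dbt_transformations",
    description="Run and test dbt models after upstream ingestion lands",
    start_date=datetime(2026, 1, 1),
    schedule_interval="0 14 * * *",        # 6 AM PST, expressed in UTC
    catchup=False,
    default_args=default_args,
) as dag:
    # Build the models; a real DAG would first sense that the day's
    # Fivetran / Informatica loads have completed.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --target prod",
    )

    # Data-quality checks defined in the dbt project.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --target prod",
    )

    dbt_run >> dbt_test
```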
3. Operational Excellence & Reliability
• Monitoring & Alerting:
Implement advanced monitoring and alerting for pipeline failures, data latency, SLA breaches, and platform health (see the callback sketch after this list).
• Continuous Improvement:
Regularly assess and fine-tune the ingestion, orchestration, and transformation layers to improve reliability, reduce latency, and enhance performance.
• Technical Collaboration:
Act as a senior technical peer, contributing to architecture reviews, design discussions, and deep technical problem-solving with cross-functional teams.
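For illustration, one common way to surface pipeline failures is an Airflow on_failure_callback that posts to a chat webhook, sketched below under the assumption of Airflow 2.x. The webhook URL is a hypothetical placeholder, and production monitoring would also cover data latency and SLA misses, not just hard failures.

```python
# Minimal sketch: surface Airflow task failures to an alerting channel.
# The webhook URL is a hypothetical placeholder; real setups typically
# layer dedicated observability tooling on top of callbacks like this.
import json
import urllib.request

ALERT_WEBHOOK = "https://hooks.example.com/dataops-alerts"  # hypothetical

def notify_failure(context):
    """Airflow on_failure_callback: post DAG/task identifiers on failure."""
    ti = context["task_instance"]
    payload = {
        "text": (
            f"Pipeline failure: {ti.dag_id}.{ti.task_id} "
            f"(run {context['run_id']})"
        )
    }
    req = urllib.request.Request(
        ALERT_WEBHOOK,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)

# Attach to every task in a DAG via default_args, e.g.:
# default_args = {"on_failure_callback": notify_failure}
```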
Required Qualifications
• Strong hands-on experience administering Snowflake and/or BigQuery in production environments
• Deep understanding of query optimization, warehouse sizing, and cost controls
• Proven experience with Apache Airflow, DBT, and modern ELT/ETL tools
• Experience operating and monitoring data ingestion platforms such as Fivetran and Informatica
• Strong knowledge of data security, RBAC, and governance models
• Ability to troubleshoot complex data pipeline and platform issues end-to-end
Preferred Qualifications
• Experience supporting large-scale, high-volume analytical workloads
• Familiarity with cloud cost optimization strategies in AWS, GCP, or Azure
• Strong SQL expertise and understanding of distributed data architectures
• Experience working in highly collaborative, fast-paced data engineering teams