

VS2 Technology
DevOps / Data Engineer (Snowflake, DBT, Airflow) - Contract - Hybrid
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a DevOps/Data Engineer (Snowflake, DBT, Airflow) on a 6-month hybrid contract, offering a competitive pay rate. It requires 2–6 years of DevOps/data engineering experience and strong skills in Snowflake, DBT, and Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 9, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Cincinnati, OH
🧠 - Skills detailed
#Observability #GitHub #Monitoring #Deployment #Logging #dbt (data build tool) #Snowflake #SQL (Structured Query Language) #Cloud #Python #Infrastructure as Code (IaC) #DevOps #Jenkins #Kubernetes #Scala #Terraform #GitLab #Data Pipeline #Airflow #ETL (Extract, Transform, Load) #Automation #Data Engineering #Docker
Role description
About the Role
We are seeking a hands-on DevOps/Data Engineer with strong experience across Snowflake, DBT, and Airflow, combined with solid DevOps skills for automation, CI/CD, and cloud-based infrastructure. This role will support both data engineering workloads and DevOps initiatives, so flexibility and versatility are key.
The ideal candidate understands data pipelines end-to-end — from ingestion and transformation to deployment and observability — and can clearly articulate individual contributions on past projects.
Key Responsibilities
✔ Design, build, and maintain scalable data pipelines using Snowflake, DBT, and Airflow (see the sketch after this list)
✔ Develop CI/CD pipelines and automate deployments for data workloads and cloud infrastructure
✔ Contribute to dimensional modeling and ELT/ETL transformations
✔ Deploy and manage infrastructure using IaC tools (Terraform is preferred)
✔ Collaborate with data, analytics, and platform teams to support business use cases
✔ Document architecture and ensure monitoring, logging, and alerting are in place
✔ Work across DevOps and Data Engineering functions depending on workload demands
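For a concrete sense of the stack, here is a minimal sketch of the kind of pipeline described above: an Airflow DAG that runs dbt models and tests against Snowflake on a daily schedule. It assumes Airflow 2.4+ and a dbt project already configured with a Snowflake target in profiles.yml; the DAG name and project paths are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_snowflake",   # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",              # assumes Airflow 2.4+ (older versions use schedule_interval)
    catchup=False,
) as dag:
    # Build the dbt models; assumes a Snowflake target configured in
    # profiles.yml at the placeholder path below.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    # Test the freshly built models so bad data fails the DAG run.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```

Chaining dbt test after dbt run means data quality failures fail the DAG run and surface through Airflow's monitoring and alerting rather than propagating downstream.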
Required Skills & Experience
Must have:
🔹 2–6 years of hands-on experience in DevOps and/or data engineering
🔹 A proven track record, with project examples showing what you built and contributed
🔹 Strong experience with:
• Snowflake
• DBT
• Airflow
• CI/CD pipelines (GitLab, GitHub Actions, Jenkins, etc.; see the sketch after this section)
• Python + SQL
Nice to have:
➕ Terraform (IaC)
➕ Docker / Kubernetes
➕ Experience with performance monitoring and observability
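To illustrate the Python + SQL side, here is a minimal sketch of a Snowflake smoke test that could run as a step in a GitLab, GitHub Actions, or Jenkins pipeline. The environment variable names are hypothetical placeholders that would map to CI secrets in practice.

```python
import os
import sys

import snowflake.connector  # pip install snowflake-connector-python


def main() -> int:
    # Placeholder environment variables; a real pipeline would map these
    # to CI secrets rather than hard-coding credentials.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "COMPUTE_WH"),
    )
    try:
        # A trivial query proves credentials, warehouse, and network path work.
        row = conn.cursor().execute("SELECT CURRENT_VERSION()").fetchone()
        print(f"Connected to Snowflake {row[0]}")
        return 0
    finally:
        conn.close()


if __name__ == "__main__":
    sys.exit(main())
```

A non-zero exit status fails the CI job, which is the usual contract for a pipeline step.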