

VS2 Technology
DevOps / Data Engineer (Snowflake, DBT, Airflow) - Contract
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DevOps/Data Engineer (Snowflake, DBT, Airflow) on a contract basis. It requires 2–6 years of experience, offers remote work, and the pay rate is undisclosed. Key skills include CI/CD, Python, SQL, and Terraform.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 22, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Cincinnati, OH
🧠 - Skills detailed
#Snowflake #Airflow #Jenkins #dbt (data build tool) #Monitoring #Scala #Kubernetes #Data Engineering #DevOps #Deployment #Automation #Data Pipeline #SQL (Structured Query Language) #GitHub #Infrastructure as Code (IaC) #Docker #Logging #Python #Observability #Cloud #GitLab #ETL (Extract, Transform, Load) #Terraform
Role description
About the Role
We are seeking a hands-on DevOps/Data Engineer with strong experience across Snowflake, DBT, and Airflow, combined with solid DevOps skills for automation, CI/CD, and cloud-based infrastructure. This role will support both data engineering workloads and DevOps initiatives, so flexibility and versatility are key.
The ideal candidate understands data pipelines end-to-end — from ingestion and transformation to deployment and observability — and can clearly articulate individual contributions on past projects.
Key Responsibilities
✔ Design, build, and maintain scalable data pipelines using Snowflake, DBT, and Airflow (see the DAG sketch after this list)
✔ Develop CI/CD pipelines and automate deployments for data workloads and cloud infrastructure
✔ Contribute to dimensional modeling and ELT/ETL transformations
✔ Deploy and manage infrastructure using IaC tools (Terraform is preferred)
✔ Collaborate with data, analytics, and platform teams to support business use cases
✔ Document architecture and ensure monitoring, logging, and alerting are in place
✔ Work across DevOps and Data Engineering functions depending on workload demands
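For context, here is a minimal sketch of the kind of orchestration this role involves: an Airflow DAG that runs dbt models against Snowflake and then runs dbt tests. The DAG ID, schedule, and project path are illustrative assumptions, not details from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical path to the dbt project; adjust to your deployment.
    DBT_DIR = "/opt/dbt/project"

    with DAG(
        dag_id="dbt_snowflake_daily",  # assumed name, for illustration
        start_date=datetime(2025, 1, 1),
        schedule="@daily",             # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        # Build models in Snowflake via dbt.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command=f"dbt run --project-dir {DBT_DIR}",
        )
        # Validate the freshly built models.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command=f"dbt test --project-dir {DBT_DIR}",
        )
        dbt_run >> dbt_test

Shelling out to dbt with BashOperator is the simplest pattern; some teams swap in a dedicated dbt operator instead, but the run-then-test dependency shape stays the same.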
Required Skills & Experience
Must have:
🔹 2–6 years of hands-on experience in DevOps and/or data engineering
🔹 Proven project examples demonstrating what you personally built and contributed
🔹 Strong experience with:
• Snowflake
• DBT
• Airflow
• CI/CD pipelines (GitLab, GitHub Actions, Jenkins, etc.)
• Python + SQL (see the connector sketch after these lists)
Nice to have:
➕ Terraform (IaC)
➕ Docker / Kubernetes
➕ Experience with performance monitoring and observability
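As a rough illustration of the Python + SQL expectation, here is a minimal sketch using the snowflake-connector-python package. The account, credentials, warehouse, and query target are placeholder assumptions, not details from this posting.

    import snowflake.connector  # pip install snowflake-connector-python

    # All connection values below are placeholders.
    conn = snowflake.connector.connect(
        account="xy12345",        # assumed account identifier
        user="PIPELINE_USER",
        password="***",
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        # Health-check query against a hypothetical dbt-built model.
        cur.execute("SELECT COUNT(*) FROM ANALYTICS.MARTS.FCT_ORDERS")
        print(cur.fetchone()[0])
    finally:
        conn.close()

In day-to-day dbt work the Snowflake connection is usually managed through profiles.yml; direct connector use like this is mostly for ad-hoc checks and custom pipeline steps.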