ADN Group

DevOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DevOps Engineer in Boston, MA, on a 12-month contract at an unspecified pay rate. It requires 3-7+ years in DevOps and expertise in Snowflake, Apache Airflow, CI/CD, Terraform, and cloud platforms (AWS, Azure, GCP).
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 10, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Boston, MA
🧠 - Skills detailed
#Data Accuracy #SQL Server #DevOps #Infrastructure as Code (IaC) #Apache Airflow #Data Warehouse #IICS (Informatica Intelligent Cloud Services) #Shell Scripting #Cloud #Kubernetes #Computer Science #Python #DevSecOps #Firewalls #Security #Data Pipeline #Metadata #Logging #Compliance #Terraform #Presto #Monitoring #Data Quality #Azure #Data Engineering #Deployment #Snowflake #Documentation #GitLab #Airflow #GitHub #Scripting #"ETL (Extract, Transform, Load)" #SQL (Structured Query Language) #Informatica #AWS (Amazon Web Services) #Automation #Azure DevOps #Docker #GCP (Google Cloud Platform) #Jenkins
Role description
Job Title: DevOps Engineer
Location: Boston, MA
Work Type: Onsite
Role Duration: 12 Months (Temporary)
Work Schedule: Monday-Friday, 9:00 AM - 5:00 PM EST

Primary Skills:
- DevOps Engineering
- Snowflake (Roles, Warehouses, Performance Tuning, Cost Control, RBAC)
- Informatica Intelligent Cloud Services (IICS)
- Apache Airflow (DAG Orchestration, Monitoring, Deployment)
- CI/CD Pipelines (GitLab, GitHub Actions, Azure DevOps, Jenkins)
- Terraform (Infrastructure as Code - IaC)
- Cloud Platforms: AWS, Azure, GCP
- Docker & Kubernetes
- Python & Shell Scripting
- DevSecOps, Security, Monitoring & Automation

Position Overview
The client is seeking an experienced DevOps Engineer to support a cloud data warehouse modernization initiative, transitioning from a SQL Server/AWS-based system to a Snowflake-based data platform. This role is responsible for developing, maintaining, and optimizing data pipelines and integration processes that support analytics, reporting, and business operations. The DevOps Engineer will design and implement CI/CD pipelines, automate data pipeline deployments, and ensure high availability, security, and operational reliability across Snowflake, Informatica (IICS), and Apache Airflow environments.

Job Duties & Responsibilities
- Build and maintain CI/CD pipelines for Snowflake, Informatica (IICS), and Apache Airflow DAG deployments
- Implement automated code promotion across development, test, and production environments
- Integrate testing, linting, and security scanning into deployment workflows
- Develop and manage Infrastructure as Code (IaC) using Terraform or similar tools
- Ensure configuration and environment consistency across multi-region and multi-cloud setups
- Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls)
- Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance
- Build proactive monitoring dashboards for job failures, data quality checks, and warehouse usage
- Optimize Snowflake pipeline performance, concurrency, and cost governance
- Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates
- Support user access provisioning and RBAC alignment across Snowflake, Informatica, and Airflow
- Troubleshoot platform and orchestration issues and lead incident response during outages
- Enforce DevSecOps best practices including encryption, secrets management, and key rotation
- Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements
- Participate in testing, deployment, and release management for new data workflows and enhancements

Required Skills, Experience, Qualifications & Abilities
- Bachelor's degree or equivalent experience in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field
- 3-7+ years of experience in DevOps, Cloud Engineering, or Data Platform Engineering
- Strong hands-on experience with Snowflake (roles, warehouses, performance optimization, cost governance)
- Expertise in Apache Airflow including DAG orchestration, monitoring, and deployments
- Experience with Informatica (IICS) pipeline deployment automation
- Strong CI/CD experience using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar tools
- Proficiency in Terraform, Python, and Shell scripting
- Deep understanding of cloud platforms such as AWS, Azure, or GCP
- Experience with containerization technologies (Docker, Kubernetes), particularly for Airflow workloads
- Solid knowledge of networking concepts and security controls
- Ability to troubleshoot complex distributed data workloads
- Strong communication skills for collaboration with both technical and non-technical stakeholders
- Excellent documentation and cross-team collaboration skills
- Proactive mindset with a focus on automation and continuous process improvement
- Detail-oriented approach with emphasis on data accuracy and operational excellence

Preferred Skills, Experience & Qualifications
- Experience migrating from SQL Server or other legacy data warehouse platforms
- Knowledge of FinOps practices for Snowflake cost and usage optimization
- Background in healthcare, finance, or other regulated industries is a plus

Arushi Khanna | Associate - Hiring & Recruitment | arushi@nsitsolutions.com
Vishal (Victor) Verma | Assistant Manager | vishal@nsitsolutions.com
NS IT Solutions | www.nsitsolutions.com