Synergyassure Inc

DevOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DevOps Engineer in Boston, MA, lasting 12+ months at a day rate of $480. Key skills include 3–7+ years in DevOps; expertise in Snowflake, Apache Airflow, and CI/CD tools; and a preferred background in healthcare or finance.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
480
🗓️ - Date
January 8, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Boston, MA
🧠 - Skills detailed
#Azure #Firewalls #ETL (Extract, Transform, Load) #DevOps #Data Warehouse #SQL Server #Apache Airflow #Cloud #Triggers #DevSecOps #Jenkins #Terraform #Python #Data Quality #Snowflake #Azure DevOps #SQL (Structured Query Language) #Data Engineering #GCP (Google Cloud Platform) #Data Pipeline #Shell Scripting #Scripting #Airflow #Computer Science #Informatica #Automation #Docker #Data Accuracy #Presto #GitHub #Kubernetes #Monitoring #Security #GitLab #Logging #Infrastructure as Code (IaC) #AWS (Amazon Web Services) #IICS (Informatica Intelligent Cloud Services) #Documentation #Compliance #Deployment #Metadata
Role description
Hi All, hope everything is going well. Kindly share a suitable profile to sandhya@synergyassure.com or reach me at +1-726-229-1448. Please share a resume for the opportunity below. Local candidates in MA only.

Client: EOHHS
Req ID: ITS77-EHS-FY26-DATA ENGINEER-001
Role: DevOps Engineer
Duration: 12+ Months
Location: 40 Broad Street, Boston, MA 02109
Work Schedule: Onsite
Interview type: Initial contact by phone; interview may be in person or by video conference

• Only qualified DevOps Engineer candidates located in the Boston, MA area will be considered, as the position requires an on-site presence

Required Education:
• Bachelor's degree or equivalent experience in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field

Required Skills, Experience, Qualifications & Abilities:
• 3–7+ years in DevOps, Cloud Engineering, or Data Platform Engineering roles
• Snowflake (roles, warehouses, performance tuning, cost control)
• Apache Airflow (DAG orchestration, monitoring, deployments)
• Informatica (IICS pipeline deployment automation preferred)
• Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar
• Proficiency with Terraform, Python, and shell scripting
• Deep understanding of cloud platforms: AWS, Azure, or GCP
• Experience with containerization (Docker, Kubernetes), especially for Airflow
• Strong knowledge of networking concepts and security controls
• Effective communication with technical and non-technical stakeholders
• Ability to troubleshoot complex distributed data workloads
• Strong documentation and cross-team collaboration skills
• Proactive and committed to process improvement and automation
• Detail-oriented, with a focus on data accuracy and process improvement

Preferred Skills, Experience, Qualifications & Abilities:
• Experience migrating from SQL Server or other legacy data warehouse platforms
• Knowledge of FinOps practices for Snowflake usage optimization
• Background in healthcare, finance, or other regulated industries a plus

The client is seeking an experienced DevOps Engineer to support a cloud data warehouse modernization initiative, migrating from a SQL Server/AWS-based system to a Snowflake-based data platform. The DevOps Engineer is responsible for developing, maintaining, and optimizing the data pipelines and integration processes that support analytics, reporting, and business operations, and will design and implement CI/CD pipelines, automate data pipeline deployments, and ensure operational reliability across the Snowflake, Informatica, and Apache Airflow environments.
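For context only (this sketch is not part of the posting): the Airflow work described above centers on DAGs, which are plain Python files that the CI/CD pipeline would deploy. A minimal sketch of such a DAG follows, assuming Airflow 2.4+; the DAG ID, task name, and loading step are hypothetical placeholders.

```python
# Minimal Airflow 2.4+ DAG sketch. The dag_id, task, and load step are
# hypothetical; a real DAG would call an ETL step that writes to Snowflake.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_snowflake():
    """Placeholder for a batch-load step targeting Snowflake."""
    print("Loading batch into Snowflake...")


with DAG(
    dag_id="example_snowflake_load",       # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",                     # 'schedule' requires Airflow 2.4+
    catchup=False,
    default_args={
        "retries": 2,                      # retry failed tasks before alerting
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    load = PythonOperator(
        task_id="load_to_snowflake",
        python_callable=load_to_snowflake,
    )
```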
Job duties and Responsibilities:
• Build and maintain CI/CD (Continuous Integration/Continuous Delivery and Deployment) pipelines for Snowflake, Informatica (IICS), and Airflow DAG (Directed Acyclic Graph) deployments
• Implement automated code promotion between development, test, and production environments
• Integrate testing, linting, and security scanning into deployment processes
• Develop Infrastructure as Code (IaC) using Terraform or similar tools to manage Snowflake objects, networking, and cloud resources
• Manage configuration and environment consistency across multi-region/multi-cloud setups
• Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls)
• Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance
• Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage
• Optimize pipeline performance, concurrency, and cost governance in Snowflake
• Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates
• Support user access provisioning and RBAC alignment across Snowflake, Informatica, and Airflow (see the sketch after this list)
• Troubleshoot platform and orchestration issues; lead incident response during outages
• Enforce DevSecOps practices including encryption, secrets management, and key rotation
• Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements
• Participate in testing, deployment, and release management for new data workflows and enhancements
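For illustration only (not part of the posting): the RBAC-provisioning duty above typically amounts to running GRANT statements programmatically. Below is a minimal sketch using the snowflake-connector-python package; the warehouse, database, and role names are hypothetical, and a production version would pull credentials from a secrets manager and run inside the CI/CD pipeline described earlier.

```python
# Sketch: automate Snowflake RBAC grants with snowflake-connector-python.
# Object and role names are hypothetical examples, not from the posting.
import os

import snowflake.connector

GRANTS = [
    "GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst_role",
    "GRANT USAGE ON DATABASE analytics_db TO ROLE analyst_role",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.public TO ROLE analyst_role",
]

# Credentials via environment variables for the sketch; prefer a secrets
# manager (and key rotation, per the DevSecOps duty) in production.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SECURITYADMIN",  # a role privileged to manage grants
)
try:
    cur = conn.cursor()
    for stmt in GRANTS:
        cur.execute(stmt)  # GRANT is idempotent: re-running is a no-op
finally:
    conn.close()
```

Keeping the grant list in version control and applying it from the pipeline makes access changes reviewable and repeatable across the dev, test, and production environments the duties call out.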