

TalentOla
Sr. DevOps Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. DevOps Engineer in Boston, MA; the contract length and pay rate are unspecified. Key skills include Terraform, GitHub Actions, AWS, and big data platforms. A Bachelor’s degree and 10 years of DevOps experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #VPC (Virtual Private Cloud) #GIT #GitHub #Infrastructure as Code (IaC) #Redshift #Big Data #DevOps #Programming #ChatGPT #Computer Science #Terraform #Data Engineering #Kubernetes #Apache Iceberg #Cloud #Containers #Scripting #Lambda (AWS Lambda) #Docker #Automation #Version Control #Spark (Apache Spark) #Python #AI (Artificial Intelligence) #Bash #RDS (Amazon Relational Database Service) #ETL (Extract, Transform, Load) #Amazon ECS (Amazon Elastic Container Service) #Data Analysis #AWS Glue #Apache Spark #EC2 #Snowflake #Amazon Redshift #Deployment
Role description
Sr. DevOps Engineer
Boston, MA (4 days a week onsite)
Description: The Data Platform Engineering team supports CI/CD and infrastructure for our integrated Data Fabric platform. This Data Fabric is a strategic core asset underpinning the operational success of our Firm. The Data Platform Engineer will design and implement efficient procedures and pipelines for software development and infrastructure deployment, and will manage and deploy key data systems and services. The engineer will work with cloud engineers, data engineers, system administrators, data administrators, and architects to find opportunities to leverage DevOps technologies to process large volumes of data, and will implement CI/CD workflows for Infrastructure as Code (IaC) and automated deployments. This role requires a motivated individual with strong technical ability, data capability, and excellent communication and collaboration skills, including the ability to develop solutions for and troubleshoot a diverse range of problems.
Responsibilities:
• Implement continuous integration and continuous delivery/deployment (CI/CD) pipelines for data infrastructure using Terraform and GitHub Actions (a minimal sketch follows this list).
• Implement and deploy GitHub Actions tools and workflows.
• Implement various development, testing, automation, and data infrastructure tools.
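For illustration only, here is a minimal sketch of what a Terraform-plus-GitHub-Actions pipeline for this kind of IaC deployment might look like. The workflow file path, the infra/ working directory, the Terraform version pin, and the branch name are assumptions made for the example and are not specified in the posting; AWS credential setup (for example, OIDC via aws-actions/configure-aws-credentials) is omitted.

```yaml
# Hypothetical workflow file: .github/workflows/terraform.yml (path assumed, not from the posting)
name: terraform-plan-apply

on:
  pull_request:
    paths: ["infra/**"]        # run only when IaC files change (assumed layout)
  push:
    branches: [main]

permissions:
  contents: read
  id-token: write              # needed if OIDC-based AWS auth is used (setup omitted here)

jobs:
  terraform:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: infra   # hypothetical directory holding the Terraform code
    steps:
      - uses: actions/checkout@v4

      - uses: hashicorp/setup-terraform@v3
        with:
          terraform_version: "1.9.0"   # example pin, not taken from the posting

      - name: Terraform init
        run: terraform init -input=false

      - name: Terraform plan
        run: terraform plan -input=false -out=tfplan

      - name: Terraform apply (main branch only)
        if: github.ref == 'refs/heads/main'
        run: terraform apply -input=false tfplan
```

Planning on pull requests and gating apply to the main branch is one common pattern; the team's actual workflow layout, backend configuration, and approval gates may differ.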
QUALIFICATIONS:
• Bachelor’s degree in computer science or engineering
• Minimum of 10 years of experience in DevOps engineering
• Experience with Git version control
• Minimum of 5 years of experience with GitHub Actions
• Minimum of 5 years of experience with Terraform
• Minimum of 5 years of AWS Cloud experience, including Lambda, EC2, and VPC
• Experience with big data platforms such as Amazon Redshift, Snowflake, Apache Iceberg, Apache Spark, AWS Glue, RDS
• Experience with containers: Docker, Amazon ECS, and/or Kubernetes
• Experience with data engineering, data analysis, and/or ETL
• Experience with programming languages such as Python
• Experience with OS-level scripting languages such as Bash and PowerShell
• Experience using advanced features of AI tools: ChatGPT, custom GPTs, and/or other models such as Claude Sonnet