

Cyber Space Technologies LLC
Sr DevOps Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr DevOps Architect in Boston, MA, on a contract basis. Requires 14+ years of experience, expertise in Terraform, GitHub Actions, AWS, and big data platforms. A Bachelor’s degree in computer science or engineering is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
January 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Containers #Kubernetes #ChatGPT #Apache Iceberg #Amazon ECS (Amazon Elastic Container Service) #Python #Deployment #Snowflake #RDS (Amazon Relational Database Service) #Spark (Apache Spark) #Big Data #Terraform #Scripting #Lambda (AWS Lambda) #Cloud #Infrastructure as Code (IaC) #Data Analysis #Automation #Apache Spark #Computer Science #EC2 #Programming #ETL (Extract, Transform, Load) #Amazon Redshift #DevOps #Version Control #Data Engineering #Docker #GIT #Redshift #GitHub #AWS Glue #AI (Artificial Intelligence) #VPC (Virtual Private Cloud) #AWS (Amazon Web Services) #Bash
Role description
Sr DevOps Architect
Boston, MA (4 days a week onsite)
Contract
Minimum 14+ years of experience.
Description:
• The Data Platform Engineering team supports CI/CD and infrastructure for our integrated Data Fabric platform. This Data Fabric is a strategic core asset underpinning the operational success of our Firm.
• The Data Platform Engineer will design and implement efficient procedures and pipelines for software development and infrastructure deployment, and will manage and deploy key data systems and services.
• The Data Platform Engineer will work with cloud engineers, data engineers, system administrators, data administrators, and architects to find opportunities to leverage DevOps technologies to process large volumes of data.
• The Data Platform Engineer will implement CI/CD workflows for Infrastructure as Code (IaC) and automated deployments. This role requires a motivated individual with strong technical ability, data capability, and excellent communication and collaboration skills, including the ability to develop solutions for and troubleshoot a diverse range of problems.
Responsibilities:
• Implement continuous integration and continuous deployment (CI/CD) pipelines for data infrastructure using Terraform and GitHub Actions.
• Implement and deploy GitHub Actions tools and workflows.
• Implement various development, testing, automation, and data infrastructure tools.
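For candidates unfamiliar with the combination, the first responsibility above typically takes a shape like the following: a GitHub Actions workflow that plans Terraform changes on pull requests and applies them on merge. This is a minimal illustrative sketch, not this team's actual pipeline; the workflow name, branch name, and OIDC-based AWS authentication are assumptions.

```yaml
# Hypothetical CI/CD workflow for Terraform-managed data infrastructure.
# Branch name, permissions model, and repo layout are assumptions.
name: terraform-ci

on:
  pull_request:
  push:
    branches: [main]

permissions:
  id-token: write   # allows OIDC federation to AWS instead of static keys
  contents: read

jobs:
  terraform:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Terraform Init
        run: terraform init -input=false
      - name: Terraform Plan
        run: terraform plan -input=false
      - name: Terraform Apply
        # apply only on merges to the main branch, never on PRs
        if: github.ref == 'refs/heads/main'
        run: terraform apply -auto-approve -input=false
```

In practice a production pipeline would add remote state configuration, plan review gates, and environment-specific workspaces, but the plan-on-PR / apply-on-merge split shown here is the common baseline.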
QUALIFICATIONS:
• Bachelor’s degree in computer science or engineering
• Minimum of 10 years of experience in DevOps engineering
• Experience with Git version control
• Minimum of 5 years of experience with GitHub Actions
• Minimum of 5 years of experience with Terraform
• Minimum of 5 years of AWS Cloud experience that includes: Lambda, EC2, and VPC
• Experience with big data platforms such as Amazon Redshift, Snowflake, Apache Iceberg, Apache Spark, AWS Glue, RDS
• Experience with containers: Docker, Amazon ECS, and/or Kubernetes
• Experience with data engineering, data analysis, and/or ETL
• Experience with programming languages such as Python
• Experience with OS-level scripting languages such as Bash and PowerShell
• Experience with advanced features of AI tools: ChatGPT, custom GPTs, and/or other models such as Claude Sonnet
Regards,
Amit Kumar






