SWITS DIGITAL Private Limited

DevOps Engineer – Data Operations

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DevOps Engineer – Data Operations, remote, with a contract length of "unknown" and a pay rate of "unknown." Key skills include Python, Go, Kubernetes, GitLab CI/CD, Azure, and experience with data manipulation libraries.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 28, 2025
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Manipulation #GitLab #Cloud #Kubernetes #Observability #Snowflake #IAM (Identity and Access Management) #DevOps #Snowpark #Grafana #Prometheus #Libraries #YAML (YAML Ain't Markup Language) #Logging #Spark (Apache Spark) #Azure #Splunk #CLI (Command-Line Interface) #Data Science #Pandas #Python #ML (Machine Learning) #AWS (Amazon Web Services)
Role description
Job Title: DevOps Engineer – Data Operations
Location: Remote

Job Description
Seeking a highly proficient developer to contribute to cloud-native application development using Python, Go, Kubernetes, Snowflake, and Azure. The role demands deep integration with GitLab CI/CD pipelines and secure identity management via Azure Entra ID.

Required Skills
• High-level proficiency in Python and Go.
• Strong experience with the Kubernetes CLI and GitOps workflows.
• Proven track record with GitLab CI/CD.
• Hands-on experience with Azure or AWS cloud services.
• Familiarity with Azure Entra ID for identity and access management.
• Solid understanding of containerization, networking, and distributed systems.
• Demonstrable skill using Python for data manipulation with one of the common libraries, such as Pandas, Snowflake Snowpark, Polars, or PySpark (see the sketch after this section).

Preferred Qualifications
• Hands-on experience building and distributing Python libraries with pip.
• Experience with Helm and Jinja-based templating within YAML files.
• Knowledge of Kubernetes networking.
• Familiarity with observability and logging stacks (Prometheus, Grafana, Splunk).
• Experience working with data science teams.
• Previous experience integrating machine learning models into cloud services.
• Basic understanding of data science and statistical analysis.
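
As a rough illustration of the data-manipulation requirement above, here is a minimal sketch assuming a Pandas workflow; the file name and column names are hypothetical and not part of the posting:

```python
import pandas as pd

# Hypothetical export of CI pipeline metrics (illustrative file name).
df = pd.read_csv("pipeline_metrics.csv")

# Summarize job duration per pipeline stage: average, worst case, and run count.
summary = (
    df.groupby("stage")["duration_seconds"]
      .agg(["mean", "max", "count"])
      .sort_values("mean", ascending=False)
)
print(summary)
```

The same aggregation could be expressed with the Polars, Snowpark, or PySpark dataframe APIs, which is the breadth the requirement asks for.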