N Consulting Global

Sr. Cloud Data DevOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Cloud Data DevOps Engineer in Glasgow, UK, offering £400/day for a 6-month contract. Key skills include Python, AWS, Terraform, and experience with data platforms like Snowflake and Databricks. Hybrid work (2 days/week in office) required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
400
-
🗓️ - Date
May 1, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glasgow City, Scotland, United Kingdom
-
🧠 - Skills detailed
#Databricks #AWS (Amazon Web Services) #Security #Data Pipeline #Docker #Spark (Apache Spark) #dbt (data build tool) #Infrastructure as Code (IaC) #Kubernetes #Scala #Terraform #Data Engineering #Strategy #Snowflake #Airflow #Cloud #DevOps #Python #Data Strategy
Role description
We are actively looking for an experienced Sr. Cloud Data DevOps Engineer to join a high-impact data engineering team working on modern, scalable data platforms.

Job Role: Sr. Cloud Data DevOps Engineer
Location: Glasgow, UK
Work Mode: Hybrid (2 days/week in office)
Rate: £400/day

Role Overview:
This is a hands-on role where you'll build and optimize data platforms, APIs, and pipelines in a cloud-native environment. You'll collaborate with cross-functional teams to deliver secure, scalable, and high-performing data solutions at enterprise scale.

Key Responsibilities:
• Design and implement infrastructure for data engineering roadmaps
• Build and deploy enterprise-grade data products and APIs
• Develop scalable and robust data pipelines
• Work closely with architects, engineers, and stakeholders across teams
• Ensure strong governance, security, and controls in all solutions

Must-Have Skills:
• Strong experience with Python / Spark
• Expertise in AWS Cloud
• Hands-on with Terraform / CloudFormation (IaC)
• Experience with Snowflake / Databricks / Airflow / dbt
• Solid understanding of DevOps practices (GitOps, Kubernetes, Docker)
• Experience building distributed data systems

⭐ Nice to Have:
• Strong background in Data Engineering products/platforms
• Exposure to enterprise-scale data strategy and pipelines