

GIOS Technology
Cloud Data Engineer (Snowflake/AWS/Python/PySpark)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Data Engineer (Snowflake/AWS/Python/PySpark) in Glasgow, hybrid (2-3 days/week). The contract length is unspecified; the listed day rate is £376. Key skills include AWS, Terraform, Snowflake, Airflow, and Python. Experience with CI/CD and big data architecture is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
376
-
🗓️ - Date
April 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glasgow, Scotland, United Kingdom
-
🧠 - Skills detailed
#Databricks #Big Data #Data Modeling #Programming #AWS (Amazon Web Services) #Data Lakehouse #Cloud #Apache Iceberg #Apache Spark #Kubernetes #Data Architecture #AI (Artificial Intelligence) #Airflow #dbt (data build tool) #Snowflake #ETL (Extract, Transform, Load) #Python #Spark (Apache Spark) #Data Engineering #Data Lake #DevOps #Delta Lake #SnowPipe #Terraform #PySpark #Deployment
Role description
I am hiring for a Cloud Data Engineer (Snowflake/AWS/Python/PySpark).
Location: Glasgow - Hybrid / 2-3 days per week
• Strong experience in AWS services and cloud infrastructure management.
• Proficiency in Terraform, including modules, providers, and enterprise usage.
• Hands-on experience with Snowflake and its Python SDK.
• Solid experience with Airflow, dbt, Apache Spark/PySpark, and Databricks.
• Strong Python programming skills for data engineering tasks.
• Experience with CI/CD pipelines, DevOps practices, and Kubernetes orchestration.
• Knowledge of big data architecture, data modeling, and ETL/ELT best practices.
Preferred Skills
• Familiarity with Snowflake advanced features: external tables, Snowpipe, Cortex AI, PrivateLink.
• Experience with Astronomer Airflow deployment.
• Exposure to Delta Lake, Apache Iceberg, and modern data lakehouse architecture.
Key Skills: AWS / Terraform / Python / Airflow / Apache Spark / cloud infrastructure management / Snowflake