GIOS Technology

Data Engineer - (AWS & Python / PySpark)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (AWS & Python/PySpark) in Glasgow, hybrid (2-3 days in-office), with a contract length of "unknown" and a pay rate of "unknown." Key skills include Python, PySpark, AWS services, and GitLab. Banking industry experience required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 2, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glasgow, Scotland, United Kingdom
-
🧠 - Skills detailed
#VPC (Virtual Private Cloud) #PySpark #Unit Testing #Athena #Data Engineering #S3 (Amazon Simple Storage Service) #Spark (Apache Spark) #GitLab #Automation #IAM (Identity and Access Management) #SageMaker #Python #Programming #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Cloud
Role description
I am hiring for a Data Engineer - (AWS & Python / PySpark).
Location: Glasgow - Hybrid, 2-3 days per week in office.
• Strong programming background in Python and PySpark.
• Hands-on experience with core AWS services, including S3, Lambda, Glue, Step Functions, Athena, SageMaker, VPC, ECS, IAM, and KMS.
• Proficiency in CloudFormation for infrastructure automation.
• Solid understanding of unit testing frameworks and best practices.
• Familiarity with GitLab for source control and CI/CD processes.
Key Skills: Python / PySpark / AWS / Banking / S3 / Lambda / SageMaker