

GIOS Technology
Data Engineer (AWS / Data Integration / Data Modelling)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (AWS/Data Integration/Data Modelling) in Birmingham, offering a hybrid work model. The contract length is unspecified; the advertised day rate is £376. Key skills include SQL, Python, AWS tooling, and finance-related data experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
376
-
🗓️ - Date
March 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Birmingham, England, United Kingdom
-
🧠 - Skills detailed
#Agile #SageMaker #AWS (Amazon Web Services) #Qlik #Data Vault #PySpark #Data Bricks #Python #Redshift #SAS #SSIS (SQL Server Integration Services) #Data Integration #Data Engineering #Informatica #Scala #S3 (Amazon Simple Storage Service) #Replication #Programming #Cloud #Spark (Apache Spark) #Vault #SQL (Structured Query Language)
Role description
I am hiring for a Data Engineer (AWS / Data Integration / Data Modelling).
Location: Birmingham - Hybrid / 2 days per week in the office
• Strong hands-on data engineering development experience.
• Advanced programming skills in SQL, Python, PySpark, and/or Scala.
• Experience with AWS data tooling such as S3, Glue, Redshift, and SageMaker (or equivalent cloud platforms).
• Experience with data integration and replication tools such as Qlik Replicate, Qlik Compose, Databricks, Informatica, SAS, or SSIS.
• Experience working in a finance-related data environment.
• Experience working within Agile delivery teams.
• Familiarity with data modelling methodologies such as Kimball, Data Vault, and Lakehouse.
Key Skills: Data engineering / AWS data tooling / S3 / Glue / Redshift / Qlik / Databricks / Informatica / SAS / SSIS