

GIOS Technology
ETL/Data Engineer (Python & Databricks)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL/Data Engineer (Python & Databricks) in Glasgow, hybrid (3 days in-office). Contract length and pay rate are unspecified. Key skills include Python, Databricks, ETL, Power BI, and experience with big data technologies.
🌎 - Country: United Kingdom
💱 - Currency: £ GBP
💰 - Day rate: Unknown
🗓️ - Date: December 12, 2025
🕒 - Duration: Unknown
🏝️ - Location: Hybrid
📄 - Contract: Unknown
🔒 - Security: Unknown
📍 - Location detailed: Glasgow, Scotland, United Kingdom
🧠 - Skills detailed: #Hadoop #Data Modeling #Git #Scala #Data Pipeline #Spark (Apache Spark) #Airflow #Python #ETL (Extract, Transform, Load) #Microsoft Power BI #Database Administration #Data Orchestration #Big Data #Cloud #Visualization #Version Control #BI (Business Intelligence) #Apache Airflow #Databricks #Data Engineering
Role description
I am hiring for an ETL/Data Engineer (Python & Databricks).
Location: Glasgow - Hybrid, 3 days per week in office
• Strong proficiency in Python, with the ability to write efficient and maintainable code.
• Hands-on experience with Databricks and cloud platforms for building scalable data pipelines (a minimal PySpark sketch follows this list).
• Solid understanding of ETL principles, data modeling, warehousing concepts, and integration best practices.
• Experience using version control tools (e.g., Git).
• Experience with data visualization tools (e.g., Power BI).
• Background in database administration or performance tuning.
• Familiarity with data orchestration tools such as Apache Airflow (see the DAG sketch after the key skills line).
• Exposure to big data technologies like Hadoop or Spark.
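For context, here is a minimal sketch of the kind of Python/Databricks ETL pipeline this role involves: extract raw CSV, transform (types, de-duplication, a derived column), and load a partitioned Delta table. The paths and table names are illustrative assumptions, not taken from the posting.

```python
# Minimal PySpark ETL sketch; paths and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV landed by an upstream process (hypothetical path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Transform: cast types, drop duplicate orders, derive a partition column.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write a partitioned Delta table (Databricks' default table format).
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("analytics.orders"))
```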
Key Skills: Python / Databricks / ETL / Power BI / Data visualization / Database administration
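And a minimal Apache Airflow sketch of how such a job might be scheduled daily, assuming Airflow 2.4+; the DAG id and task callable are hypothetical, and in practice the task would trigger the PySpark job above (e.g., via the Databricks Jobs API).

```python
# Minimal Airflow DAG sketch; dag_id, schedule, and callable are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_orders_etl():
    # Placeholder for triggering the PySpark job above.
    print("running orders ETL")

with DAG(
    dag_id="orders_etl_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_etl", python_callable=run_orders_etl)
```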