Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
🌎 - Country
💱 - Currency
€ EUR
-
💰 - Day rate
€600
-
🗓️ - Date discovered
September 5, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Contract
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Croatia
-
🧠 - Skills detailed
#Data Pipeline #Storage #Deployment #Airflow #Data Engineering #Distributed Computing #Python #Security #Data Science #ML (Machine Learning) #Cloud #Data Quality #Monitoring #Database Management #Scala #Apache Airflow #Spark (Apache Spark)
Role description
NEW CONTRACT ROLE - Senior Data Engineer (Croatian, Bosnian, Serbian or Montenegrin and English-Speaking)

Remote | Occasional on-site PI planning | Initial 6-Month Contract | Start October 2025 | Likely Extension | €600 EUR Per Day

To apply, email:

THE OPPORTUNITY

We are working with a large UK consultancy on an exciting contract opportunity for a Senior Data Engineer. You will join a growing Data Engineering team, working alongside other data engineers and data scientists. You will help maintain and improve data quality and usability while implementing machine learning algorithms and optimising pipelines.

THE ROLE

• Full-time engagement starting October 2025 and running until March 2026
• Remote role, with potential 1-2 days on-site for PI planning every 3 months in Zagreb, Croatia
• Collaborate with a cross-functional team to deliver high-quality data engineering solutions
• Improve and optimise data pipelines for efficiency, scalability, and reliability
• Support machine learning initiatives and ensure seamless integration with data workflows

TECH STACK / REQUIREMENTS

• Excellent knowledge of Python
• Basic understanding of other technology stacks (enough to identify issues)
• Expert knowledge of at least one distributed computing framework (e.g. Spark)
• Expert in at least one database management system and its query language
• Strong understanding of cloud-based storage solutions
• Good knowledge of architecture patterns
• Solid understanding of data analytics techniques
• Ability to set up data deployment pipelines
• Expert in writing tests and ensuring code quality
• Good knowledge of security and privacy best practices
• Experience with at least one workflow scheduling tool (e.g. Apache Airflow)
• Knowledge of implementing monitoring tools

TO BE CONSIDERED…

Please apply directly by email with your CV and availability.
KEYWORDS: Python, Spark, Data Engineering, Pipelines, Database, Cloud Storage, Architecture Patterns, Data Analytics, Machine Learning, Airflow, Monitoring Tools, Security, Privacy