Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience, focused on building complex data pipelines and cloud-based solutions. Contract length and pay rate are unspecified. Key skills include SQL, Python, Apache Spark, and cloud data services (AWS/GCP/Azure).
🌎 - Country
United Kingdom
πŸ’± - Currency
Β£ GBP
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 30, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Unknown
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
England, United Kingdom
🧠 - Skills detailed
#Snowflake #dbt (data build tool) #DevOps #MongoDB #Monitoring #Redshift #Data Quality #Docker #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #Data Pipeline #Airflow #Automation #Schema Design #Kafka (Apache Kafka) #Data Engineering #Bash #Apache Spark #Python #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Logging #Hadoop #Big Data #Scripting #Azure #PostgreSQL #MySQL #Databases #Data Modeling #Cloud
Role description
We’re seeking experienced Data Engineers with 6+ years of experience and the Right to Work in the UK to join top projects with leading clients. If you have strong expertise in building complex data pipelines and cloud-based solutions, we want to hear from you!

Core Skills We’re Looking For:
• ETL & Data Pipelines: SQL, Python, Apache Spark, Airflow (see the sketch after this list)
• Databases: MySQL, PostgreSQL, MongoDB, Redshift, Snowflake
• Data Warehousing & Big Data: Hadoop, AWS/GCP/Azure data services
• Cloud & DevOps Basics: AWS/GCP/Azure, Docker, CI/CD pipelines
• Data Modeling & Analytics: Dimensional modeling, schema design
• Scripting & Automation: Python, Bash
• Monitoring & Logging: Data quality checks, logging frameworks

💡 Bonus: Knowledge of Kafka, dbt, or real-time streaming platforms
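As a concrete illustration of the first bullet, here is a minimal sketch of a daily ETL pipeline using Airflow’s TaskFlow API, with a simple data quality gate of the kind the Monitoring & Logging bullet describes. Everything in it is a hypothetical assumption for illustration (the DAG name, the sample records, and the source/target systems named in the comments), not a client requirement:

```python
# Illustrative sketch only: a daily Airflow DAG wiring an
# extract -> transform -> quality-check -> load flow.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract() -> list[dict]:
        # Hypothetical stand-in; a real task would query MySQL/PostgreSQL or an API.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": -1.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop obviously bad records (negative amounts) before loading.
        return [r for r in rows if r["amount"] >= 0]

    @task
    def quality_check(rows: list[dict]) -> list[dict]:
        # Fail the run early if the batch is empty or a key field is missing.
        if not rows or any(r.get("order_id") is None for r in rows):
            raise ValueError("data quality check failed")
        return rows

    @task
    def load(rows: list[dict]) -> None:
        # A real task would write to Redshift/Snowflake; here we just log.
        print(f"loading {len(rows)} rows")

    load(quality_check(transform(extract())))


orders_etl()
```

In practice each task would read from and write to managed Airflow connections rather than inline data, but the extract → transform → quality-check → load shape is the same.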