

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience, focused on building complex data pipelines and cloud solutions. Contract length and pay rate are unspecified. Key skills include SQL, Python, Apache Spark, and cloud services.
Country
United Kingdom
Currency
£ GBP
-
Day rate
-
Date discovered
September 30, 2025
Project duration
Unknown
-
Location type
Unknown
-
Contract type
Unknown
-
Security clearance
Unknown
-
Location detailed
England, United Kingdom
-
Skills detailed
#Snowflake #dbt (data build tool) #DevOps #MongoDB #Monitoring #Redshift #Data Quality #Docker #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #Data Pipeline #Airflow #Automation #Schema Design #Kafka (Apache Kafka) #Data Engineering #Bash #Apache Spark #Python #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Logging #Hadoop #Spark (Apache Spark) #Big Data #Scripting #Azure #PostgreSQL #MySQL #Databases #Data Modeling #Cloud
Role description
We're seeking experienced #DataEngineers with 6+ years of experience and the Right to Work in the UK to join top projects with leading clients. If you have strong expertise in building complex data pipelines and cloud-based solutions, we want to hear from you!
Core Skills We're Looking For:
β’ ETL & Data Pipelines: SQL, Python, Apache Spark, Airflow
β’ Databases: MySQL, PostgreSQL, MongoDB, Redshift, Snowflake
β’ Data Warehousing & Big Data: Hadoop, AWS/GCP/Azure Data Services
β’ Cloud & DevOps Basics: AWS/GCP/Azure, Docker, CI/CD pipelines
β’ Data Modeling & Analytics: Dimensional modeling, schema design
β’ Scripting & Automation: Python, Bash
β’ Monitoring & Logging: Data quality checks, logging frameworks
Bonus: Knowledge of Kafka, dbt, or real-time streaming platforms