

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer on an initial 6-month contract, paying £400-£450 per day. It requires strong SQL skills, experience with BigQuery and PostgreSQL, and familiarity with cloud technologies. The role is remote-first, with occasional travel to Newcastle Upon Tyne.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
400-450
🗓️ - Date discovered
September 28, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
England, United Kingdom
🧠 - Skills detailed
#Data Analysis #Data Science #PostgreSQL #GCP (Google Cloud Platform) #Databases #Data Lineage #Kafka (Apache Kafka) #Python #Deployment #BI (Business Intelligence) #Scala #Data Engineering #Data Pipeline #ETL (Extract, Transform, Load) #Cloud #SQL (Structured Query Language) #BigQuery
Role description
GCP Data Engineer - £400-£450 per day
Initial 6-month contract | Remote-first (travel to Newcastle Upon Tyne roughly once a month)
Start date: ASAP
Outside IR35
Dcoded are partnering with a leading consultancy looking for a skilled GCP Data Engineer to join their team. This role offers the flexibility of working remotely, with travel to the North East expected around once a month.
The role
As a Data Engineer, you'll be responsible for designing, building, and maintaining modern, compliant, and scalable data solutions. You'll ensure seamless integration and movement of data across systems, working with cloud-based architectures and a wide range of tools including BigQuery, PostgreSQL, Python, and Kafka.
You'll play a key role in enabling high-quality data pipelines, improving data lineage, and translating business needs into technical solutions that deliver real insights.
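To give a flavour of the day-to-day work, the sketch below is a minimal, illustrative Python pipeline that consumes events from a Kafka topic and streams them into a BigQuery table. It assumes the confluent-kafka and google-cloud-bigquery client libraries; the topic, project, and table names are hypothetical, and a production pipeline would add batching, schema validation, and dead-lettering.

```python
# Illustrative only: a minimal Kafka -> BigQuery bridge.
# Assumes `pip install confluent-kafka google-cloud-bigquery` and GCP
# credentials available in the environment.
import json

from confluent_kafka import Consumer
from google.cloud import bigquery

TABLE_ID = "my-project.analytics.events"  # hypothetical table
TOPIC = "events"                          # hypothetical topic

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "bq-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])

bq = bigquery.Client()

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue  # nothing to read, or a transient broker error
        row = json.loads(msg.value())
        # Streaming insert; returns a list of per-row errors (empty on success).
        errors = bq.insert_rows_json(TABLE_ID, [row])
        if errors:
            print(f"BigQuery rejected row: {errors}")
finally:
    consumer.close()
```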
Key responsibilities
• Lead by example in delivering reliable, efficient, and compliant data solutions
• Integrate, consolidate, and transmit data while maintaining high quality standards
• Design and deliver solutions that support business needs
• Build and maintain data services with tools such as BigQuery, PostgreSQL, Python, and Kafka
• Develop and enhance data pipelines in cloud-based environments, applying software engineering best practices (see the sketch after this list)
• Own complex feature development and act as a voice for best practice within the team
• Collaborate with senior stakeholders, translating business requirements into technical outcomes
• Drive continuous improvement in data lineage and quality
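As one reading of "software engineering best practices" in pipeline code: keep transformations as small, pure, typed functions with I/O at the edges, so they can be unit-tested without any infrastructure. The record and field names below are hypothetical.

```python
# Illustrative: a pure, typed transform step, testable without Kafka/BigQuery.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class Event:
    user_id: str
    amount_pence: int
    occurred_at: datetime


def normalise(raw: dict) -> Event:
    """Validate a raw message and coerce it into a typed record."""
    amount = int(raw["amount_pence"])
    if amount < 0:
        raise ValueError(f"negative amount in message: {raw!r}")
    return Event(
        user_id=str(raw["user_id"]),
        amount_pence=amount,
        occurred_at=datetime.fromisoformat(raw["occurred_at"]).astimezone(timezone.utc),
    )


def test_normalise_coerces_types() -> None:
    # Runs under pytest with no external services.
    event = normalise({"user_id": 42, "amount_pence": "999",
                       "occurred_at": "2025-01-01T09:00:00+00:00"})
    assert event == Event("42", 999, datetime(2025, 1, 1, 9, tzinfo=timezone.utc))
```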
Essential experience
• Strong SQL querying and development expertise (illustrated after this list)
• Experience with relational SQL and/or analytics databases (Postgres, Cassandra, BigQuery)
• Proven background in designing, developing, and deploying modern data solutions using best practice software engineering principles
• Solid understanding of data modelling, ETL processes, data warehousing, and both structured & unstructured data
• Confident working with cloud-based technologies and deployment infrastructure
• Awareness of how Data Engineering and Data Analysis fit within broader Data Science and BI workflows
• Excellent problem-solving, analytical, and decision-making skills
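As a small illustration of the SQL and BigQuery side (dataset, table, and column names are hypothetical), parameterised queries through the google-cloud-bigquery client keep SQL safe and reusable compared with string interpolation:

```python
# Illustrative: a parameterised BigQuery query from Python.
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT user_id,
           SUM(amount_pence) / 100.0 AS total_gbp
    FROM `my-project.analytics.events`   -- hypothetical table
    WHERE occurred_at >= @since
    GROUP BY user_id
    ORDER BY total_gbp DESC
    LIMIT 10
"""

job = client.query(
    sql,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter(
                "since", "TIMESTAMP", datetime(2025, 1, 1, tzinfo=timezone.utc)
            ),
        ]
    ),
)

for row in job.result():  # blocks until the query completes
    print(row.user_id, row.total_gbp)
```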