Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in London (hybrid, 2-3 days in office) on a 6-month contract, paying £225 to £245 daily. Requires 2+ years of experience, a STEM/business degree, and skills in Python, SQL, and big data tools.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
£225 to £245
🗓️ - Date discovered
April 23, 2025
🕒 - Project duration
6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Engineering #Airflow #ETL (Extract, Transform, Load) #BI (Business Intelligence) #GCP (Google Cloud Platform) #Data Integration #NoSQL #Databases #Graph Databases #Python #MongoDB #PostgreSQL #R #Big Data #Scala #Azure #Kafka (Apache Kafka) #Luigi #Hadoop #Data Quality #Mathematics #Spark (Apache Spark) #SQL (Structured Query Language) #AWS (Amazon Web Services) #Cloud #API (Application Programming Interface) #Java
Role description

We're Hiring: Data Engineer

Location: London (2-3 days in office)

Experience: 2+ Years

Degree: STEM/Business

Duration: 6-month contract

Rate: £225 to £245 a day (Umbrella)

We're looking for a Data Engineer to help power our innovation engine. You’ll design data models, build scalable ETL pipelines, codify business logic, and drive data integration across complex systems—structured and unstructured alike.

This is your chance to turn raw data into real business value using cutting-edge tech in a collaborative, forward-thinking team.

What You’ll Do:

   • Design & implement data models and scalable ETL/ELT pipelines (see the sketch after this list)

   • Map data sources, codify business logic, and build data flows

   • Develop data quality solutions & explore new technologies

   • Collaborate with analysts, developers, and business stakeholders
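
For a flavour of the day-to-day work above, here is a minimal, hypothetical sketch in Python (standard library only; the file, table, and column names are invented for illustration): extract rows from a CSV, apply one data-quality rule, and load the result into a relational store.

```python
"""Minimal ETL sketch: CSV -> validate -> SQLite (all names are hypothetical)."""
import csv
import sqlite3


def extract(path):
    # Read raw rows from a source CSV file.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows):
    # Codify one business rule and one data-quality check:
    # drop rows missing a customer_id, and normalise amounts to pence.
    clean = []
    for row in rows:
        if not row.get("customer_id"):
            continue  # data-quality rule: reject incomplete records
        clean.append((row["customer_id"], int(round(float(row["amount"]) * 100))))
    return clean


def load(rows, db_path="warehouse.db"):
    # Load validated rows into a relational store (SQLite stands in for PostgreSQL here).
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS transactions (customer_id TEXT, amount_pence INTEGER)"
    )
    con.executemany("INSERT INTO transactions VALUES (?, ?)", rows)
    con.commit()
    con.close()


if __name__ == "__main__":
    load(transform(extract("transactions.csv")))
```

The same extract-validate-load shape scales up to Spark jobs and cloud warehouses; only the tooling changes.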

What You Bring:

   • 2+ years in data engineering or related roles

   • Bachelor’s in CS, Engineering, Mathematics, Finance, etc.

   • Proficiency in Python, SQL, and one or more: R, Java, Scala

   • Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB)

   • Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi) (a DAG sketch follows this list)

   • Bonus: experience with BI tools, API integrations, and graph databases
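
As a glimpse of the workflow tooling named above, the sketch below assumes a recent Apache Airflow 2.x installation and uses invented DAG and task names; it simply wires three placeholder tasks into a daily schedule.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


# Placeholder callables standing in for real extract/transform/load logic.
def extract():
    print("extract")


def transform():
    print("transform")


def load():
    print("load")


with DAG(
    dag_id="daily_etl_example",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older releases use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract before transform, and transform before load.
    extract_task >> transform_task >> load_task
```

Luigi expresses the same dependencies through Task classes and requires(); the underlying idea is identical.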

Why Join Us?

   • Work with large-scale, high-impact data

   • Solve real-world problems with a top-tier team

   • Flexible, fast-paced, and tech-forward environment

Apply now and help us build smarter, data-driven solutions.

#TechCareers #Innovation #Python #SQL #Spark #Kafka #Hadoop #DataEngineer #ETLDeveloper #BigDataEngineer #DataEngineering #AnalyticsJobs

#HiringNow #JobOpening #Careers