

AWS Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer in London for a 6-month contract, paying £225 to £245 per day. Requires 2+ years of data engineering experience, proficiency in Python and SQL, and familiarity with big data tools and cloud platforms.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
245
🗓️ - Date discovered
June 13, 2025
🕒 - Project duration
6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#API (Application Programming Interface) #Graph Databases #Data Engineering #Java #Spark (Apache Spark) #Airflow #Data Integration #AWS (Amazon Web Services) #Scala #Mathematics #BI (Business Intelligence) #MongoDB #Kafka (Apache Kafka) #Big Data #Databases #Data Quality #Python #Azure #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Hadoop #R #NoSQL #GCP (Google Cloud Platform) #Luigi #PostgreSQL #Cloud
Role description
We're Hiring: AWS Data Engineer
Location: London (2-3 days in office)
Experience: 2+ Years
Degree: STEM/Business
Rate: £225 to £245 per day (Umbrella)
Duration: 6-month contract
We're looking for a Data Engineer to help power our innovation engine. You’ll design data models, build scalable ETL pipelines, codify business logic, and drive data integration across complex systems—structured and unstructured alike.
This is your chance to turn raw data into real business value using cutting-edge tech in a collaborative, forward-thinking team.
What You’ll Do:
• Design & implement data models and scalable ETL/ELT pipelines (see the illustrative sketch after this list)
• Map data sources, codify business logic, and build data flows
• Develop data quality solutions & explore new technologies
• Collaborate with analysts, developers, and business stakeholders
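To give a flavour of the work, here is a minimal, illustrative sketch of such a pipeline in Python using pandas and SQLAlchemy. It is an assumption-laden example, not our actual codebase: the source file, "amount" column, target table, and connection string are all placeholders.

import pandas as pd
from sqlalchemy import create_engine

def extract(path: str) -> pd.DataFrame:
    # Pull raw data from a flat-file source (hypothetical file name).
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Codify simple business logic: normalise column names, drop duplicates,
    # and apply a basic data-quality rule on an assumed "amount" column.
    df = df.rename(columns=str.lower).drop_duplicates()
    df["is_valid"] = df["amount"].notna() & (df["amount"] >= 0)
    return df

def load(df: pd.DataFrame, table: str, conn_str: str) -> None:
    # Write the cleaned data to a relational target (PostgreSQL here).
    engine = create_engine(conn_str)
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    frame = transform(extract("transactions.csv"))
    load(frame, "clean_transactions", "postgresql://user:password@localhost:5432/analytics")

In the real role the extract and load steps would target the team's own sources and warehouse, and the quality rules would come from the codified business logic described above.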
What You Bring:
• 2+ years in data engineering or related roles
• Bachelor’s in CS, Engineering, Mathematics, Finance, etc.
• Proficiency in Python, SQL, and one or more: R, Java, Scala
• Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB)
• Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi); a short Airflow sketch follows this list
• Bonus: experience with BI tools, API integrations, and graph databases
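On the workflow-tool side, here is a minimal sketch of a daily DAG using the Airflow 2.x TaskFlow API. The DAG name, schedule, and task bodies are illustrative placeholders, not a description of our pipelines.

from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def daily_quality_check():
    @task
    def extract() -> list:
        # Stand-in for pulling records from an upstream API or table.
        return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]

    @task
    def transform(rows: list) -> list:
        # Keep only rows that pass a simple data-quality rule.
        return [r for r in rows if r["amount"] is not None]

    @task
    def load(rows: list) -> None:
        # Stand-in for writing to a warehouse table.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))

daily_quality_check()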
Why Join Us?
• Work with large-scale, high-impact data
• Solve real-world problems with a top-tier team
• Flexible, fast-paced, and tech-forward environment
Apply now and help us build smarter, data-driven solutions.
#TechCareers #Innovation #Python #SQL #Spark #Kafka #Hadoop #DataEngineer #ETLDeveloper #BigDataEngineer #DataEngineering #AnalyticsJobs
#HiringNow #JobOpening #Careers