
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a Data Engineer contract running until 31/10/2025, based in Birmingham (hybrid) and paying up to £414 p/d inside IR35. Key skills include DevOps, CI/CD, Python, SQL, ETL/ELT, and cloud/containerization experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
414
🗓️ - Date discovered
July 8, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
Birmingham, England, United Kingdom
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Spark (Apache Spark) #API (Application Programming Interface) #Istio #Vault #Cloud #Data Engineering #MLflow #NoSQL #Pandas #NumPy #Documentation #Argo #Cloudera #GCP (Google Cloud Platform) #Bash #Groovy #Python #GIT #GitHub #Docker #Big Data #Airflow #IP (Internet Protocol) #SonarQube #Security #SQL (Structured Query Language) #Kubernetes #Delta Lake #DevOps #Jenkins #Flask #Linux #Monitoring #Hadoop #ETL (Extract, Transform, Load)
Role description
Role Title: Data Engineer
Duration: contract running until 31/10/2025
Location: Birmingham, hybrid
Rate: up to £414 p/d (Umbrella), inside IR35
Key Skills / Requirements
• Strong DevOps engineering background
• CI/CD pipelines
• Git / GitHub
• GitHub Actions / hooks
• Jenkins
• SonarQube
• Linux, e.g. Red Hat
• Groovy / Bash / Python scripting
• Nexus
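To give a flavour of the scripting expected here, below is a minimal sketch of a Git pre-commit hook written in Python; the use of pytest and the hook location are illustrative assumptions, not requirements from the client.

```python
#!/usr/bin/env python3
"""Minimal Git pre-commit hook: block the commit if the test suite fails.

Illustrative sketch only -- assumes pytest is installed and the script is
saved as .git/hooks/pre-commit with execute permission.
"""
import subprocess
import sys


def main() -> int:
    # Run the test suite quietly; a non-zero exit code aborts the commit.
    result = subprocess.run(["python", "-m", "pytest", "-q"])
    if result.returncode != 0:
        print("pre-commit: tests failed, commit aborted", file=sys.stderr)
    return result.returncode


if __name__ == "__main__":
    sys.exit(main())
```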
Good understanding of Python library and application development
• Dependency and package management, e.g. Poetry, pip
• Pandas and NumPy
• APIs, e.g. Flask, Dash
• IDEs (PyCharm, VS Code) and remote development
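As a sketch of the kind of Python API work implied above, a minimal Flask endpoint serving Pandas/NumPy summary statistics might look like the following; the route name and payload shape are hypothetical.

```python
"""Minimal Flask API sketch: POST a list of numbers, get summary stats back."""
from flask import Flask, jsonify, request
import numpy as np
import pandas as pd

app = Flask(__name__)


@app.route("/stats", methods=["POST"])
def stats():
    # Expect a JSON body like {"values": [1.0, 2.5, 3.0]}.
    values = request.get_json(force=True).get("values", [])
    s = pd.Series(np.asarray(values, dtype=float))
    return jsonify(mean=float(s.mean()), std=float(s.std()), count=int(s.count()))


if __name__ == "__main__":
    app.run(debug=True)
```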
Good understanding of data engineering
• ETL / ELT pipelines
• Spark
• Airflow
• SQL / NoSQL
• Delta Lake
• Parquet
• Avro
• Partitioning
• Starburst
• S3 buckets
• Postgres
• MLflow
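A minimal PySpark ETL step touching several of the items above (Spark, ETL, Parquet, partitioning, S3) could look like this; the bucket paths and column names are hypothetical.

```python
"""Sketch of an ETL step in PySpark: read raw CSV, derive a partition
column, write partitioned Parquet. Paths and column names are illustrative."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: raw CSV landed in an S3 bucket (path is illustrative).
raw = spark.read.option("header", True).csv("s3a://raw-bucket/events/")

# Transform: type the timestamp and derive an event_date partition key.
events = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write Parquet partitioned by date so downstream queries can prune.
events.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://curated-bucket/events/"
)
spark.stop()
```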
Experience in Cloud and Containerization
• GCP / Internal cloud
• Docker / Kubernetes
• Argo CD
• Monitoring tooling, e.g. ELK
• Service Mesh, e.g. Istio
• Experience in secrets management, e.g. Vault
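For illustration, a small health check using the official Kubernetes Python client might look as follows; the namespace and the availability of a kubeconfig are assumptions.

```python
"""Sketch: list pods and flag any that are not Running, using the official
Kubernetes Python client (assumes `pip install kubernetes` and a valid
kubeconfig; the namespace is illustrative)."""
from kubernetes import client, config


def check_pods(namespace: str = "data-platform") -> None:
    config.load_kube_config()  # use config.load_incluster_config() inside a pod
    v1 = client.CoreV1Api()
    for pod in v1.list_namespaced_pod(namespace).items:
        phase = pod.status.phase
        if phase != "Running":
            print(f"{pod.metadata.name}: {phase}")


if __name__ == "__main__":
    check_pods()
```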
Good understanding of networking, security and operating systems
• TCP/IP
• DNS
• SSH
• SSL/TLS
• Encryption and tokenization
• CPU and memory management
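A standard-library sketch of one such task, checking the remaining validity of a host's TLS certificate, is shown below; the hostname is illustrative.

```python
"""Sketch: report how many days remain on a host's TLS certificate,
using only the Python standard library (hostname is illustrative)."""
import socket
import ssl
import time


def days_until_expiry(host: str, port: int = 443) -> float:
    ctx = ssl.create_default_context()
    # Open a TCP connection, complete the TLS handshake, read the peer cert.
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as ssock:
            cert = ssock.getpeercert()
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    return (expires - time.time()) / 86400


if __name__ == "__main__":
    print(f"{days_until_expiry('example.com'):.1f} days left")
```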
Experience of supporting Big Data infrastructure (a brief monitoring sketch follows this list)
• Hadoop cluster
• Spark cluster
• JDK/JVM
• Cloudera
• Strong problem-solving and troubleshooting skills
• Takes ownership and is able to work independently
• Able to work under high pressure in a fast-paced environment
• Good at documentation and knowledge sharing
• Able to communicate and work with multi-region / multi-cultural teams
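As referenced above, here is a brief Hadoop cluster monitoring sketch, assuming the standard `hdfs` CLI is installed and on PATH; the lines filtered for are taken from typical `dfsadmin -report` output and may vary by Hadoop version.

```python
"""Sketch: basic Hadoop cluster health check by shelling out to the
standard `hdfs dfsadmin -report` command (assumes the Hadoop CLI is on
PATH and the user has permission to run it)."""
import subprocess


def hdfs_report() -> str:
    result = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


if __name__ == "__main__":
    # Surface only the headline capacity/node lines for a quick glance.
    for line in hdfs_report().splitlines():
        if line.startswith(("Configured Capacity", "DFS Used", "Live datanodes")):
            print(line)
```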
All profiles will be reviewed against the required skills and experience. Due to the high number of applications, we will only be able to respond to successful applicants in the first instance. Thank you for your interest and the time taken to apply!