

TekWissen ®
- Data Engineer - Analytics
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer - Analytics on a 6-month fixed-term contract, offering £41,224.49-£78,034.81 per year. Key skills include Python, SQL, PySpark, and experience in data ingestion, quality management, and analytics solutions. Hybrid remote in London WC2N.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
354
-
🗓️ - Date
December 17, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
London WC2N
-
🧠 - Skills detailed
#Kafka (Apache Kafka) #Data Pipeline #Data Modeling #Data Processing #Agile #SQL (Structured Query Language) #NiFi (Apache NiFi) #GIT #Apache NiFi #Apache Kafka #Data Architecture #Cloud #Data Engineering #Elasticsearch #Big Data #Data Cleansing #Python #Automation #PySpark #ETL (Extract, Transform, Load) #Logging #Apache Airflow #Datasets #Batch #Data Ingestion #Spark (Apache Spark) #Data Quality #Monitoring #Version Control #Data Lifecycle #Scala #Airflow
Role description
Role Overview
We are seeking a skilled Data Engineer / Data Analytics Engineer to design, build, and maintain scalable data pipelines and analytics solutions. The role focuses on data ingestion, transformation, quality management, and enabling high-quality analytics through robust data modeling and processing frameworks.
Key Responsibilities
Design, develop, and maintain end-to-end data ingestion and processing pipelines.
Build and optimize data models to support analytics and reporting use cases.
Ensure high standards of data quality, accuracy, and consistency across datasets.
Perform data cleansing, validation, and transformation on large-scale datasets.
Develop efficient and reusable data processing applications using Python and PySpark.
Integrate data from multiple sources using streaming and batch-processing frameworks.
Collaborate with analytics, product, and business teams to translate requirements into technical solutions.
Implement monitoring, logging, and performance tuning for data workflows.
Maintain code quality and version control using Git and best engineering practices.
Areas of Expertise
Data Analytics & Reporting Enablement
Data Modelling & Warehousing Concepts
Data Quality Management
Data Ingestion & Integration
Data Cleansing & Processing
Software Development for Data Platforms
Technical Skills (Required)
Python for data processing and automation
SQL for data querying and transformation
PySpark for large-scale data processing
Apache Airflow for workflow orchestration
Apache Kafka for real-time data streaming
Apache NiFi for data ingestion and flow management
Elasticsearch for search and analytics use cases
Git for version control and collaboration
Nice to Have
Experience with cloud-based data platforms
Knowledge of distributed systems and big data architectures
Familiarity with CI/CD pipelines for data applications
Experience & Qualifications
3+ years of experience in Data Engineering, Analytics Engineering, or a related role
Strong understanding of data pipelines, ETL/ELT processes, and data lifecycle management
Ability to work in an agile, fast-paced environment.
Job Type: Fixed term contract
Contract length: 6 months
Pay: £41,224.49-£78,034.81 per year
Work Location: Hybrid remote in London WC2N