Remobi

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer; the contract length and pay rate are unspecified. It requires strong AWS and DynamoDB skills, experience building ETL pipelines, and familiarity with BI tools such as AWS QuickSight. Remote work is offered.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
England, United Kingdom
-
🧠 - Skills detailed
#Data Quality #AI (Artificial Intelligence) #NoSQL #Indexing #Data Pipeline #Version Control #AWS (Amazon Web Services) #Cloud #GIT #BI (Business Intelligence) #Network Security #ETL (Extract, Transform, Load) #Amazon Neptune #Databases #Datasets #API (Application Programming Interface) #Documentation #Scala #Cybersecurity #AWS Glue #Lambda (AWS Lambda) #Graph Databases #Pandas #S3 (Amazon Simple Storage Service) #Data Engineering #Security #Python #DynamoDB #IAM (Identity and Access Management) #Neo4J #Athena
Role description
Are you a Senior Data Engineer with strong AWS and DynamoDB experience who enjoys building data platforms from the ground up? This role focuses on building an AI-powered analytics layer on top of a network security data platform, helping turn complex operational data into clear, actionable insights. You'll be working with large volumes of network and security log data from industrial environments, designing the core data infrastructure and enabling visibility through BI dashboards used by stakeholders.

Your Role as a Senior Data Engineer:
You will take ownership of designing and implementing the data layer that underpins security analytics use cases. This includes structuring high-volume event data, building reliable pipelines, and enabling downstream analytics. This is a hands-on role, working closely with an AI/Analytics specialist and domain experts to ensure the data platform is scalable, reliable, and aligned with real-world security requirements.

What You'll Do:
• Design and implement a DynamoDB database for high-volume, time-series network and security data
• Define table structures, partition and sort keys, and indexing strategies for performance and scalability
• Optimise cost and performance using DynamoDB features (capacity modes, TTL, streams, indexing strategies)
• Build robust ETL pipelines to ingest data from Excel, CSV, and API sources
• Clean, validate, and normalise complex, multi-source security data
• Automate pipeline orchestration, scheduling, and error handling
• Connect the data layer to AWS QuickSight and build interactive dashboards
• Design datasets, calculated fields, and visualisations to surface actionable insights
• Collaborate with domain experts and stakeholders to translate requirements into data solutions
• Produce clear technical documentation (schemas, pipelines, data definitions, runbooks)

What You'll Bring:
• Strong hands-on experience with DynamoDB (data modelling, GSIs, performance tuning)
• Proven experience building ETL/ELT pipelines in Python (pandas, boto3, etc.)
• Solid AWS experience (S3, Lambda, IAM, CloudWatch, and related services)
• Experience with AWS QuickSight for dashboarding and BI delivery
• Strong understanding of NoSQL data modelling and high-throughput data systems
• Experience working with messy, multi-format data and implementing data quality checks
• Strong Python skills with clean coding practices and version control (Git)
• Ability to communicate clearly and work with both technical and non-technical stakeholders

Nice to Have:
• Experience with AWS Glue and Athena
• Exposure to graph databases (Neo4j, Amazon Neptune)
• Experience with CI/CD for data pipelines
• Background in cybersecurity, network data, or similar domains
• Understanding of event-based data structures (process mining or similar)
• Experience using AI-assisted development tools

Why Join the Remobi community:
• Fully or 90% remote, B2B/freelance contracts
• Opportunity to work on high-impact data and AI programmes for large-scale enterprise clients
• Access to a network of experienced data, AI, and engineering professionals

👉 Apply today: If you're comfortable building data platforms for complex, real-world use cases and want to work on security-focused analytics, apply now!
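To make the DynamoDB modelling tasks above concrete (partition and sort keys, TTL for time-series events), here is a minimal sketch of a key scheme for high-volume event data. All names (`SRC#`, `EVT#`, `expires_at`, the 90-day retention) are illustrative assumptions, not the client's actual schema; day-bucketed, optionally sharded partition keys are a common way to avoid hot partitions for noisy sources.

```python
from datetime import datetime, timezone

def event_keys(source_id: str, ts: datetime, shard: int = 0) -> dict:
    """Compose hypothetical DynamoDB keys for one network-security event.

    Partition key: source id + day bucket + shard suffix, so writes for
    a single noisy source spread across partitions.
    Sort key: ISO-8601 timestamp, so a time-window query becomes an
    efficient Query with a begins_with/between condition on SK.
    """
    day = ts.strftime("%Y-%m-%d")
    return {
        "PK": f"SRC#{source_id}#{day}#{shard}",
        "SK": f"EVT#{ts.isoformat()}",
        # TTL attribute in epoch seconds: DynamoDB deletes expired items
        # automatically once TTL is enabled on this attribute.
        "expires_at": int(ts.timestamp()) + 90 * 24 * 3600,  # 90-day retention
    }

keys = event_keys("fw-01", datetime(2026, 4, 27, 12, 0, tzinfo=timezone.utc))
```

A Global Secondary Index could then project, say, event type as its partition key to support the cross-source queries the role mentions.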
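The "clean, validate, and normalise" pipeline work might look like the following pandas sketch: normalising column names from inconsistent exports, coercing types, and gating out rows that fail validation. The CSV content and column names are invented for illustration.

```python
import io
import pandas as pd

# Hypothetical raw export: inconsistent column names, a blank row,
# and an unparseable numeric value.
raw_csv = """Source IP,dest ip,Timestamp,bytes
10.0.0.1,10.0.0.9,2026-04-27 12:00:00,512
,,
10.0.0.2,10.0.0.9,2026-04-27 12:05:00,not_a_number
"""

def normalise(frame: pd.DataFrame) -> pd.DataFrame:
    frame = frame.copy()
    # Normalise headers: lower-case, snake_case.
    frame.columns = [c.strip().lower().replace(" ", "_") for c in frame.columns]
    # Drop rows that are entirely empty.
    frame = frame.dropna(how="all")
    # Coerce types; unparseable values become NaN/NaT instead of raising.
    frame["timestamp"] = pd.to_datetime(frame["timestamp"], errors="coerce")
    frame["bytes"] = pd.to_numeric(frame["bytes"], errors="coerce")
    # Data-quality gate: discard rows that failed either coercion.
    bad = frame["timestamp"].isna() | frame["bytes"].isna()
    return frame[~bad].reset_index(drop=True)

clean = normalise(pd.read_csv(io.StringIO(raw_csv)))
```

In a real pipeline the rejected rows would typically be counted and quarantined rather than silently dropped, so data-quality metrics can be surfaced in the QuickSight layer.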