Back-End & Data Engineer (Creative Asset Pipeline)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Back-End & Data Engineer (Creative Asset Pipeline) on a contract basis, requiring expertise in Python or Node.js, AWS/GCP/Azure, relational/NoSQL databases, and search engines. Experience with vector databases and ML-based tagging is a plus.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
680
-
🗓️ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Cloud #AWS (Amazon Web Services) #Azure #Metadata #GCP (Google Cloud Platform) #ML (Machine Learning) #AI (Artificial Intelligence) #PostgreSQL #Python #React #DynamoDB #Data Engineering #API (Application Programming Interface) #NoSQL #Databases #Storage #Elasticsearch
Role description
Beagle are seeking a hands-on back-end/data engineer to partner with our senior creative technologist on a high-volume Generative-AI programme. You'll design and build the storage, metadata and search systems behind thousands of images and videos: everything from cloud object storage and databases to automated tagging pipelines and asset-management APIs. You should be fluent in Python or Node.js for API development, comfortable with AWS/GCP/Azure storage, relational/NoSQL databases and search engines (PostgreSQL, DynamoDB, Elasticsearch, Algolia), and able to integrate with React/Next.js front-ends. Experience with vector databases, content rights management or ML-based tagging is a plus.
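For candidates unfamiliar with the shape of this work, the ingest-tag-search flow described above can be sketched in miniature. This is purely illustrative and not part of the listing: the `Asset` and `AssetIndex` names are hypothetical, and a real build would use cloud object storage plus a database/search engine (e.g. S3 + PostgreSQL/Elasticsearch) rather than in-memory dictionaries.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One image or video plus its metadata."""
    path: str
    content: bytes
    tags: set[str] = field(default_factory=set)

    @property
    def asset_id(self) -> str:
        # Content hash gives a stable, deduplication-friendly key.
        return hashlib.sha256(self.content).hexdigest()

class AssetIndex:
    """Toy in-memory stand-in for the metadata store and search index."""

    def __init__(self) -> None:
        self._assets: dict[str, Asset] = {}
        self._by_tag: dict[str, set[str]] = {}

    def ingest(self, asset: Asset, auto_tags: set[str]) -> str:
        # auto_tags stands in for output from an ML-based tagger.
        asset.tags |= auto_tags
        self._assets[asset.asset_id] = asset
        for tag in asset.tags:
            self._by_tag.setdefault(tag, set()).add(asset.asset_id)
        return asset.asset_id

    def search(self, tag: str) -> list[Asset]:
        # Exact-tag lookup; a real system would use Elasticsearch/Algolia.
        return [self._assets[a] for a in self._by_tag.get(tag, ())]

index = AssetIndex()
asset_id = index.ingest(Asset("renders/hero.png", b"fake-bytes"), {"hero", "render"})
print(index.search("hero")[0].path)
```

In production the same three responsibilities (stable IDs, tag writes, tag queries) map onto object storage keys, database rows, and search-engine documents respectively.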