AI/ML Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI/ML Data Engineer on a remote, full-time contract lasting over 6 months, offering competitive pay. Key skills include deep learning, NLP, and experience with large language models, TensorFlow, PyTorch, and Snowflake.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
-
🗓️ - Date discovered
August 27, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Fixed Term
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Cloud #Hugging Face #NLP (Natural Language Processing) #ML (Machine Learning) #PyTorch #Snowflake #Deep Learning #AI (Artificial Intelligence) #Libraries #Data Engineering #Transformers #Deployment #Data Science #Python #Classification #Scala #ETL (Extract, Transform, Load) #SpaCy #TensorFlow #Data Pipeline
Role description
A full-time contract ML/AI Data Engineer with a strong background in deep learning and natural language processing is required to support the development of proprietary models of meaning for private corporations and investment firms. The role involves designing, building, and optimizing scalable machine learning pipelines, with a particular focus on large language models (LLMs) and data engineering best practices. Working remotely, the successful candidate will collaborate with a team with pioneering expertise in natural language processing, contributing to innovative solutions that advance language understanding and applied AI. This is an excellent opportunity for a motivated professional to gain exposure to cutting-edge NLP research while applying practical skills to real-world data challenges.
Deliverables
• Design and implement robust ML/AI data pipelines to support ongoing model development and deployment.
• Train, fine-tune, and evaluate large language models for specialized use cases.
• Apply deep learning techniques to NLP tasks such as semantic analysis, text classification, and information extraction.
• Optimize data workflows for performance and scalability using platforms such as Snowflake.
• Collaborate with cross-functional team members to translate research into production-ready solutions.
• Document processes, maintain code repositories, and ensure reproducibility of experiments and models.
• Provide insights and recommendations to enhance the efficiency and accuracy of proprietary NLP models.
Requirements
• Strong academic or professional background in data science, machine learning, or AI engineering.
• Hands-on experience with deep learning frameworks such as TensorFlow or PyTorch.
• Expertise in natural language processing, including familiarity with techniques such as embeddings, transformers, and semantic modeling.
• Experience working with large language models, including training, fine-tuning, and evaluation.
• Proficiency in Python and common ML/NLP libraries (Hugging Face Transformers, spaCy, scikit-learn, etc.).
• Knowledge of Snowflake or other cloud-based data platforms for managing and optimizing data pipelines.
• Strong problem-solving skills and the ability to work independently in a remote environment.
• Excellent communication skills for collaborating with both technical and non-technical stakeholders.
• Commitment to delivering high-quality, reproducible, and scalable solutions.
About Twine
Twine is a leading freelance marketplace connecting top freelancers, consultants, and contractors with companies that need creative and tech expertise. Trusted by Fortune 500 companies and innovative startups alike, Twine enables companies to scale their teams globally.
Our Mission
Twine's mission is to empower creators and businesses to thrive in an AI-driven, freelance-first world.