

Gardner Resources Consulting, LLC
Machine Learning Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Machine Learning Engineer with 5+ years of experience in ML model development, GCP & Vertex AI expertise, and strong Python skills. Remote work is available (EST hours), with an in-person interview required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#BigQuery #GCP (Google Cloud Platform) #Model Evaluation #AI (Artificial Intelligence) #Data Framework #Dataflow #ML (Machine Learning) #Cloud #Databases #MongoDB #NLP (Natural Language Processing) #Kafka (Apache Kafka) #Scala #NoSQL #GIT #Spark (Apache Spark) #Data Science #Deployment #Docker #Python #RDBMS (Relational Database Management System) #Big Data #Kubernetes
Role description
Machine Learning Engineer
Remote (EST hours)
• In-Person Interview Required
• 5+ years designing, building, and deploying ML models, including data preprocessing, feature engineering, model evaluation, and production ML workflows; experience with LLMs, RAG (retrieval-augmented generation), and NLP is a plus.
• Must have GCP & Vertex AI experience: hands-on development and scaling of ML solutions on Google Cloud Platform, including Vertex AI for training, deployment, and model lifecycle management.
• Strong Python development with experience working across structured and unstructured data using BigQuery, RDBMS, and NoSQL databases such as MongoDB; familiarity with big data frameworks (Spark, Ray, Dask) and streaming tools (Kafka, Pub/Sub) preferred.
• Experience integrating ML into production systems using Docker, Kubernetes (GKE), Dataflow, and Git, with a collaborative approach alongside data scientists, software engineers, and product teams to deliver scalable, high-performance solutions.






