MLOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an MLOps Engineer (GCP Specialization) on a long-term remote contract, requiring 10+ years of experience, proficiency in Python, and expertise in GCP services. Strong knowledge of Terraform, Docker, and ML frameworks is essential.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 3, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Denver, CO
🧠 - Skills detailed
#Kubernetes #BitBucket #Airflow #GCP (Google Cloud Platform) #Dataflow #Spark (Apache Spark) #Docker #ETL (Extract, Transform, Load) #PyTorch #ML (Machine Learning) #PySpark #Data Science #Monitoring #GitLab #TensorFlow #Scala #Programming #Computer Science #Deployment #AI (Artificial Intelligence) #Data Engineering #Terraform #Storage #BigQuery #Cloud #MLflow #Python
Role description
Hi, our client is looking for an MLOps Engineer (GCP Specialization) for a long-term remote project; the detailed requirements are below.

Job Title: MLOps Engineer (GCP Specialization)
Location: Remote
Duration: Long-term contract

Position Overview:
The MLOps Engineer (GCP Specialization) is responsible for designing, implementing, and maintaining infrastructure and processes on Google Cloud Platform (GCP) to enable the seamless development, deployment, and monitoring of machine learning models at scale. This role bridges data science, data engineering, and infrastructure, ensuring that machine learning systems are reliable, scalable, and optimized for GCP environments.

Job Description:
• Bachelor's degree in Computer Science or equivalent, with a minimum of 10+ years of relevant experience.
• Proficiency in programming languages such as Python.
• Expertise in GCP services, including Vertex AI, Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Cloud Storage, Cloud Composer (managed Airflow), and Dataproc with PySpark.
• Strong experience with infrastructure as code (Terraform).
• Familiarity with containerization (Docker, GKE) and CI/CD pipelines (GitLab, Bitbucket).
• Knowledge of ML frameworks (TensorFlow, PyTorch, scikit-learn), MLOps tools compatible with GCP (MLflow, Kubeflow), and generative AI RAG applications.
• Understanding of data engineering concepts, including ETL pipelines with BigQuery, Dataflow, and Dataproc (PySpark).
• Excellent communication skills, including the ability to communicate effectively with internal and external customers.
• Ability to apply strong industry knowledge to customer needs and resolve customer concerns, with a high level of focus and attention to detail.
• Strong work ethic, good time management, and the ability to work with diverse teams.