Rivago Infotech Inc

GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with ML knowledge, offering a long-term remote contract (preferably NY/NJ) at a competitive pay rate. Requires 3-5+ years of data engineering experience, GCP expertise, and proficiency in SQL and Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
May 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Google Cloud Storage #Dataflow #Monitoring #AI (Artificial Intelligence) #Batch #Apache Spark #Docker #Data Science #Kubernetes #ML (Machine Learning) #Computer Science #GCP (Google Cloud Platform) #Apache Airflow #Data Lifecycle #Airflow #Storage #Data Governance #GIT #Data Engineering #Data Security #ETL (Extract, Transform, Load) #BigQuery #Data Pipeline #Terraform #Data Warehouse #Deployment #SQL (Structured Query Language) #Data Modeling #Spark (Apache Spark) #Apache Beam #Infrastructure as Code (IaC) #Python #Data Management #IAM (Identity and Access Management) #Security #Automation #Programming #Cloud #Data Loss Prevention
Role description
Role: GCP Data Engineer with ML knowledge
Location: Remote (preferably NY/NJ)
Duration: Long-term project

Key Responsibilities
· Pipeline Development & ETL: Design and deploy robust batch and streaming data pipelines using Cloud Dataflow (Apache Beam) and Cloud Pub/Sub (a minimal pipeline sketch follows this description).
· Data Modeling & Warehouse: Construct and optimize data models in BigQuery for high-performance analytics and ML model consumption.
· MLOps & Deployment: Operationalize ML models developed by data scientists, transitioning models from experimentation to production environments using Vertex AI (see the deployment sketch below).
· Feature Engineering: Collaborate with data scientists to implement feature engineering pipelines that automate the extraction of training features from raw data.
· Data Security & Quality: Implement data governance, privacy, and security best practices (IAM, Data Loss Prevention) throughout the data lifecycle.
· Automation: Automate data workflows and orchestration using Cloud Composer (Apache Airflow) (see the DAG sketch below).
· Monitoring & Optimization: Monitor pipeline performance using Cloud Monitoring and optimize for cost and speed.

Required Qualifications
· Experience: 3-5+ years of data engineering experience, with at least 2+ years focused on GCP.
· Programming Skills: Expert-level SQL and strong Python programming skills.
· GCP Expertise: Proven experience with Cloud Functions, Cloud Run, GCE, GKE, BigQuery, Dataflow, Dataproc, Pub/Sub, Google Cloud Storage, and Vertex AI.
· ML Knowledge: Understanding of machine learning fundamentals (training, testing, evaluation, drift) and feature engineering techniques.
· Data Management: Strong understanding of SQL and unstructured data management.
· Containers & CI/CD: Hands-on experience with Docker, Kubernetes (GKE), and CI/CD tools.
· Infrastructure as Code: Experience with Terraform to provision and manage infrastructure.
· Education: Bachelor's degree in Computer Science, Engineering, or a related field.

Preferred Qualifications
· Certification: Google Cloud Professional Data Engineer.
· MLOps Specialization: Experience with Kubeflow or Vertex AI Pipelines.
· Data Modeling: Strong understanding of data warehouse modeling patterns (Kimball/Inmon).

Key Technologies
· GCP Core: Cloud Functions, Cloud Run, BigQuery, Dataflow, Pub/Sub, Composer, Dataproc, Vertex AI.
· Languages: Python, SQL.
· Frameworks: Apache Beam, Apache Spark.
· Tools: Terraform, Git, Docker, Kubernetes.
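
For candidates gauging the level of pipeline work this role describes, a representative task might look like the following minimal sketch: a streaming Apache Beam pipeline that reads JSON events from Pub/Sub and appends them to a BigQuery table. The project, topic, and table names are hypothetical placeholders, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True lets the Dataflow runner treat this as an unbounded pipeline.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Hypothetical topic path; substitute a real projects/.../topics/... value.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Pub/Sub delivers raw bytes; decode each message into a dict.
            | "ParseJson" >> beam.Map(json.loads)
            # Append rows to an existing table (hypothetical dataset and table).
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```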
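The automation responsibility typically means authoring Airflow DAGs that Cloud Composer schedules. Below is a minimal sketch, assuming the Airflow Google provider package is installed; the DAG name, dataset, and refresh query are illustrative assumptions, not requirements from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_feature_refresh",  # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2026, 1, 1),
    catchup=False,
) as dag:
    # Rebuild a (hypothetical) feature table once a day via a BigQuery job.
    refresh_features = BigQueryInsertJobOperator(
        task_id="refresh_features",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.user_features AS "
                    "SELECT user_id, COUNT(*) AS event_count "
                    "FROM analytics.events GROUP BY user_id"
                ),
                "useLegacySql": False,
            }
        },
    )
```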
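Operationalizing a model on Vertex AI, as the MLOps bullet describes, usually comes down to uploading a trained artifact and deploying it to a managed endpoint. A sketch using the google-cloud-aiplatform SDK follows; the project, bucket path, and serving container tag are assumptions for illustration only.

```python
from google.cloud import aiplatform

# Hypothetical project and region; set these to real values.
aiplatform.init(project="example-project", location="us-central1")

# Upload a trained model artifact from GCS with a prebuilt serving container.
# The artifact path and container image tag are illustrative assumptions.
model = aiplatform.Model.upload(
    display_name="churn-model",
    artifact_uri="gs://example-bucket/models/churn/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
    ),
)

# Deploy to an endpoint for online predictions.
endpoint = model.deploy(machine_type="n1-standard-4", min_replica_count=1)
print(endpoint.resource_name)
```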