

Senior GCP Engineer with Vertex AI and MLOps - Remote (US)
Featured Role | Apply directly with Data Freelance Hub
This is a remote (US) contract position for a Senior GCP Engineer with Vertex AI and MLOps. It requires 7+ years of GCP experience, expertise in Vertex AI, strong communication skills, and the ability to work independently with data science teams.
Country: United States
Currency: $ USD
Day rate: Unknown
Date discovered: July 26, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Data Engineering #Data Pipeline #Data Science #Documentation #BigQuery #Libraries #Compliance #Data Quality #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #NoSQL #AI (Artificial Intelligence) #Leadership #ML (Machine Learning) #Cloud #GCP (Google Cloud Platform) #Python #Scala #Databricks #Automated Testing #Data Ingestion
Role description
Senior GCP Engineer with Vertex AI and MLOps
Remote (Cincinnati, OH)
Contract
Required
• Senior GCP engineer with at least 7 years of project experience.
• Hands-on experience with Vertex AI and MLOps.
• Able to work independently with the data science team.
• Strong overall communication skills.
Description
• As a Sr. Data Engineer, you will have the opportunity to lead the development of innovative data solutions, enabling the effective use of data across the organization.
• You will be responsible for designing, building, and maintaining robust data pipelines and platforms to meet business objectives, focusing on data as a strategic asset.
• A strong emphasis will be placed on expertise in GCP, Vertex AI, and advanced feature engineering techniques.
Key Responsibilities
• Provide Technical Leadership
• Build and Maintain Data Pipelines: Design, build, and maintain scalable, efficient, and reliable data pipelines to support data ingestion, transformation, and integration across diverse sources and destinations, using tools such as Kafka, Databricks, and similar toolsets (see the first sketch after this list).
• Drive Digital Innovation: Leverage innovative technologies and approaches to modernize and extend core data assets, including SQL-based, NoSQL-based, cloud-based, and real-time streaming data platforms.
• Implement Feature Engineering: Develop and manage feature engineering pipelines for machine learning workflows, utilizing tools like Vertex AI, BigQuery ML, and custom Python libraries (see the second sketch after this list).
• Implement Automated Testing: Design and implement automated unit, integration, and performance testing frameworks to ensure data quality, reliability, and compliance with organizational standards (see the third sketch after this list).
• Optimize Data Workflows
• Mentor Team Members
• Draft and Review Documentation: Draft and review architectural diagrams, interface specifications, and other design documents to ensure clear communication of data solutions and technical requirements.
• Cost/Benefit Analysis: Present opportunities with cost/benefit analysis to leadership, guiding sound architectural decisions for scalable and efficient data solutions.
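
To make the pipeline responsibility concrete, here is a minimal sketch of a streaming ingestion step: consume events from a Kafka topic and land them in BigQuery. The topic, project, and table names are hypothetical placeholders, not values from this posting, and a production pipeline would add batching, retries, schema handling, and dead-lettering.

```python
import json

from kafka import KafkaConsumer        # pip install kafka-python
from google.cloud import bigquery      # pip install google-cloud-bigquery

# Consume JSON events from a hypothetical Kafka topic.
consumer = KafkaConsumer(
    "orders-events",                                   # hypothetical topic
    bootstrap_servers=["localhost:9092"],              # assumed broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

bq = bigquery.Client(project="example-project")        # hypothetical project
table_id = "example-project.raw.orders_events"         # hypothetical table

for message in consumer:
    # Stream each event into the raw landing table.
    errors = bq.insert_rows_json(table_id, [message.value])
    if errors:
        print(f"BigQuery insert failed: {errors}")
```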
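The second sketch illustrates the feature engineering responsibility: deriving per-customer aggregates into a feature table that downstream Vertex AI training jobs could read. It uses plain BigQuery SQL run from Python rather than BigQuery ML or the Vertex AI Feature Store, and all dataset, table, and column names are illustrative assumptions.

```python
from google.cloud import bigquery

bq = bigquery.Client(project="example-project")        # hypothetical project

# Build a feature table of 90-day customer aggregates from a raw orders table.
feature_sql = """
CREATE OR REPLACE TABLE `example-project.features.customer_features` AS
SELECT
  customer_id,
  COUNT(*)         AS order_count_90d,
  AVG(order_total) AS avg_order_value_90d,
  DATE_DIFF(CURRENT_DATE(), MAX(order_date), DAY) AS days_since_last_order
FROM `example-project.raw.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
GROUP BY customer_id
"""

bq.query(feature_sql).result()   # blocks until the feature table is built
```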
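The third sketch shows one way the automated testing responsibility might look in practice: a pytest data quality gate over the hypothetical feature table from the previous sketch. Real frameworks would also cover schema, freshness, and volume checks.

```python
import pytest
from google.cloud import bigquery

TABLE = "example-project.features.customer_features"   # hypothetical table


@pytest.fixture(scope="module")
def bq():
    # One shared BigQuery client per test module.
    return bigquery.Client(project="example-project")   # hypothetical project


def test_no_null_customer_ids(bq):
    # The feature table key must never be null.
    sql = f"SELECT COUNT(*) AS n FROM `{TABLE}` WHERE customer_id IS NULL"
    row = next(iter(bq.query(sql).result()))
    assert row.n == 0


def test_customer_ids_are_unique(bq):
    # Exactly one feature row per customer.
    sql = f"SELECT COUNT(*) - COUNT(DISTINCT customer_id) AS dupes FROM `{TABLE}`"
    row = next(iter(bq.query(sql).result()))
    assert row.dupes == 0
```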