

Nasscomm
GCP Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Engineer on a 6+ month contract, offering remote work with up to 50% travel. Requires 5+ years in tech consulting, GCP implementation, SQL, Python, and experience with Snowflake, Databricks, and data management.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: May 6, 2026
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #Snowflake #Python #Dataflow #Data Management #AWS (Amazon Web Services) #BigQuery #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Automation #Azure #Data Science #Cloud #Databricks #Kubernetes #Java #Spark (Apache Spark) #Consulting
Role description
Role: GCP Engineer
Location: Remote + Travel
Duration: 6+ Months contract
(Ability to travel up to 50% on average, based on the work you do and the clients and industries/sectors you serve)
Description:
• Implement and manage large-scale data ecosystems, including governance frameworks and integration of structured and unstructured data, to generate actionable insights using cloud-based platforms.
• Apply automation, cognitive, and data science techniques to enhance data management, predict outcomes, and recommend strategic actions. Drive operational efficiency by maintaining robust data ecosystems, leveraging analytics expertise, and delivering continuous insights through As-a-Service models.
• Candidates should have 5+ years of experience in technology consulting or industry roles, with a proven ability to contribute to end-to-end architecture and solutions using Google Cloud Platform (GCP).
• At least one full lifecycle implementation using GCP tools is required, along with 2+ years of experience in SQL and Python. Experience leading distributed teams across complex engagements is essential. A bachelor's degree or equivalent experience is required, along with flexibility to travel up to 50%.
• Preferred qualifications include experience with Snowflake, Databricks, GCP Dataflow, Spark, Java, and tools such as BigQuery, Pub/Sub, and Kubernetes. GCP certifications, exposure to Azure/AWS, strong communication and problem-solving skills, and advanced degrees are advantageous.