

Culinovo
GCP Solution Architect (Data Platform)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Solution Architect (Data Platform) in New York City, NY, on a W2 contract. Requires 12+ years in IT, expertise in GCP, data engineering, and utilities domain experience. LinkedIn profile mandatory; GCP certifications preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 3, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Data Engineering #Programming #Azure #Leadership #GCP (Google Cloud Platform) #Databricks #Automation #Data Pipeline #Python #Migration #Dataflow #ML (Machine Learning) #Data Warehouse #BigQuery #Scala #Security #SQL (Structured Query Language) #Compliance #Scripting #AI (Artificial Intelligence) #Batch #Cloud
Role description
Job Title: GCP Solution Architect (Data Platform)
Location: Onsite New York City, NY
Employment Type: W2 Only
Work Authorization: U.S. Citizens / Green Card Holders only
Experience Required: 12+ Years in IT
LinkedIn Profile: Mandatory
Candidates must be engaged on our W2 only.
Job Summary
We are seeking an experienced GCP Solution Architect (Data Platform) to lead the design, development, and implementation of large-scale cloud data platforms. The ideal candidate will have deep expertise in Google Cloud Platform, strong architectural experience, and hands-on exposure to modern data engineering, analytics, AI/ML, and regulatory compliance within the utilities domain.
Key Responsibilities
• Lead end-to-end architecture, design, and delivery of scalable GCP-based data platforms.
• Define cloud migration strategies, including re-platforming and re-architecting data workloads.
• Architect, optimize, and implement solutions using Vertex AI, BigQuery, Dataflow, Pub/Sub, and other GCP-native services.
• Build and support batch and streaming data pipelines, data warehouses, and analytics frameworks.
• Integrate AI/ML solutions into data platforms to enhance intelligence and automation.
• Collaborate with cross-functional teams to ensure cloud architecture aligns with business and security requirements.
• Evaluate and compare Azure & Databricks capabilities to guide solution tradeoffs and ensure compatibility with GCP solutions.
• Ensure all solutions adhere to regulatory and security standards, including NERC CIP and other utilities governance frameworks.
• Provide architectural assessments, recommendations, technical leadership, and best practices across cloud data engineering initiatives.
Required Skills & Experience
• 12+ years of overall IT experience with extensive cloud architecture exposure.
• Strong hands-on experience with GCP Architecture, GCP migrations, and data platform modernization.
• Proficiency with BigQuery, Dataflow, Pub/Sub, Vertex AI, and cloud-native data analytics services.
• Expertise with data engineering concepts including batch/streaming pipelines, data warehouses, and distributed processing.
• Experience with AI/ML integration within cloud ecosystems.
• Working knowledge of Azure and Databricks, with ability to analyze tradeoffs and design hybrid solutions.
• Familiarity with NERC CIP, security, compliance frameworks, and governance processes for utilities.
• Excellent communication, leadership, and stakeholder management skills.
• LinkedIn profile is mandatory for submission.
Preferred Qualifications
• GCP Professional Certifications (Cloud Architect, Data Engineer, Machine Learning Engineer).
• Experience in the utilities or energy domain.
• Strong scripting or programming knowledge (Python, SQL).
