TalentBridge

Data Platform Product Owner/Analyst

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Platform Product Owner/Analyst for a 6-month hybrid contract in Charlotte, NC, offering a competitive pay rate. Requires 10+ years in data management and 5+ years in GCP enterprise solutions and product management.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
376
🗓️ - Date
April 4, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#BigQuery #Dataflow #Data Management #ETL (Extract, Transform, Load) #Cloud #Hadoop #Data Engineering #Leadership #Strategy #Storage #Data Strategy #Spark (Apache Spark) #Security #Data Ingestion #Data Architecture #Scala #Big Data #GCP (Google Cloud Platform) #IAM (Identity and Access Management)
Role description
JOB TITLE: Data Platform Product Owner/Analyst
LOCATION: Charlotte, NC 28202 (Hybrid)
JOB TYPE: 6-Month Contract (possibility of extension or conversion)

GCP Data Platform Product Owner/Analyst

The GCP Data Platform Product Owner within the Enterprise Data Platform (EDP) Product Management team is responsible for defining, prioritizing, and delivering enterprise-scale data capabilities on Google Cloud Platform (GCP). This role drives the strategic roadmap for the GCP Data Platform, partners with cross-functional stakeholders, and ensures alignment with enterprise data strategy, governance, and architectural standards.

Key Responsibilities
• Own and define the product roadmap for the GCP Data Platform, covering data ingestion, transformation, storage, governance, and consumption.
• Partner with business stakeholders, product leaders, and engineering teams to translate business needs into scalable platform capabilities.
• Ensure alignment with enterprise architecture, governance, security, and regulatory standards.
• Manage and prioritize the product backlog, ensuring clear requirements aligned with strategic objectives.
• Enable modern data capabilities using GCP-native services such as BigQuery, Dataflow, Pub/Sub, Dataproc, Dataplex, Cloud Storage, and Composer.
• Develop and maintain a multi-year roadmap based on business value, investment priorities, and emerging technologies.
• Establish and promote reusable patterns across data ingestion, integration, and consumption.
• Collaborate with engineering teams to deliver scalable, reliable, and high-performance platform solutions.
• Communicate product strategy, roadmap, and progress to senior leadership and technical stakeholders.

Required Qualifications
• 10+ years of experience in data management, data platforms, data engineering, or analytics.
• 5+ years of experience delivering enterprise data solutions on cloud or big data platforms (e.g., GCP, Hadoop, Spark).
• 5+ years of Product Owner or Product Management experience, including roadmap definition and MVP delivery.
• 3+ years of experience in data architecture or enterprise data platform design.

Preferred Qualifications
• Strong knowledge of GCP data services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, Dataplex, IAM).
• Experience delivering cloud-based data ingestion, ETL/ELT pipelines, analytics, and data consumption solutions.
• Strong stakeholder management, collaboration, and influencing skills across all organizational levels.
• Excellent communication skills with the ability to explain complex data concepts clearly.
• Proven ability to define product requirements, features, and user experiences.
• Ability to manage competing priorities and deliver in a fast-paced, evolving environment.
• Bachelor's degree or higher.