GCP Data Modeler

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Modeler on a long-term contract, remote in the United States, offering W2 positions. Required skills include GCP expertise, data modeling techniques, SQL proficiency, and familiarity with ETL/ELT. GCP Professional Data Engineer Certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Vault #Storage #NoSQL #Data Modeling #Python #MongoDB #Physical Data Model #SQL (Structured Query Language) #BI (Business Intelligence) #Dataflow #Scala #Data Warehouse #GCP (Google Cloud Platform) #Data Integrity #Cloud #BigQuery #Looker #Scrum #Data Management #Microsoft Power BI #Data Vault #ETL (Extract, Transform, Load) #Agile #Metadata #Snowflake #Data Mart #Data Governance #Data Engineering #Scripting #Clustering #Kafka (Apache Kafka) #Tableau
Role description
Job Title: GCP Data Modeler
Location: Remote in United States
Duration: Long-Term Contract
Candidates must be willing to work on our W2 (W2 candidates only). All work authorizations are eligible, including US Citizens, Green Card holders, TN Visa, H4 EAD, and L2 EAD.

Job Summary
We are looking for a skilled GCP Data Modeler to design, build, and optimize data models and data structures on Google Cloud Platform. The role involves working closely with data engineers, analysts, and business stakeholders to ensure data models support reporting, analytics, and real-time processing needs.

Key Responsibilities
• Design and develop conceptual, logical, and physical data models for GCP-based solutions.
• Implement and optimize data structures for BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
• Support data warehouse and data mart design for analytical and reporting use cases.
• Work with data engineers to ensure efficient ETL/ELT pipelines aligned with data models.
• Translate business requirements into scalable data models that support analytics and operations.
• Define naming conventions, standards, and metadata management practices.
• Optimize BigQuery tables, partitioning, clustering, and query performance (see the sketch after this listing).
• Ensure data integrity, consistency, and governance across all layers.
• Collaborate with business users, analysts, and architects to validate data modeling requirements.

Required Skills & Qualifications
• Strong experience as a Data Modeler / Data Engineer with hands-on GCP exposure.
• Proficiency in data modeling techniques (3NF, star schema, snowflake schema, data vault).
• Expertise in BigQuery data modeling, partitioning, clustering, and performance tuning.
• Good understanding of ETL/ELT pipelines and their integration with GCP services.
• Experience with data warehousing concepts, dimensional modeling, and OLAP systems.
• Proficiency in SQL and familiarity with Python or other scripting languages.
• Knowledge of NoSQL modeling (Bigtable, Firestore, MongoDB) is a plus.
• Strong understanding of data governance, lineage, and metadata management.
• Excellent communication and collaboration skills.

Preferred Qualifications
• GCP Professional Data Engineer Certification.
• Experience with streaming data (Pub/Sub, Kafka, Dataflow).
• Exposure to BI/reporting tools (Looker, Tableau, Power BI).
• Familiarity with Agile/Scrum methodologies.
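
Illustrative example
To give candidates a feel for the BigQuery partitioning and clustering work named in the responsibilities, here is a minimal SQL sketch. It is not part of the client's environment; the dataset, table, and column names (analytics_mart.fct_orders, order_date, customer_id) are hypothetical.

-- Hypothetical orders fact table for a star-schema data mart on BigQuery
CREATE TABLE IF NOT EXISTS analytics_mart.fct_orders (
  order_id     STRING NOT NULL,   -- business key / degenerate dimension
  customer_id  STRING,            -- foreign key to a customer dimension
  order_date   DATE,              -- partitioning column
  order_ts     TIMESTAMP,         -- event timestamp
  total_amount NUMERIC            -- additive measure
)
PARTITION BY order_date            -- queries filtered on order_date scan only matching partitions
CLUSTER BY customer_id, order_id   -- co-locates rows on common filter/join keys to cut bytes scanned
OPTIONS (description = 'Illustrative orders fact table (hypothetical example)');

A query that filters on order_date and customer_id would then read only the relevant partitions and clustered blocks, which is the usual lever for the BigQuery performance tuning this role calls for.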