UNICOM Technologies Inc

Senior GCP Data Modeler

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior GCP Data Modeler, requiring 10+ years of experience, onsite in Issaquah, WA, at $60/hr on C2C. Key skills include data modeling, cloud migration, and ETL tools, with a focus on GCP and Azure databases.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
November 12, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Issaquah, WA
-
🧠 - Skills detailed
#Azure Data Factory #Data Security #ETL (Extract, Transform, Load) #ERWin #SQL (Structured Query Language) #Microsoft Power BI #Compliance #BitBucket #Databases #GIT #BI (Business Intelligence) #Jira #BigQuery #Agile #Normalization #Azure SQL #Microsoft Azure #GCP (Google Cloud Platform) #Dataflow #Metadata #Migration #Scrum #Deployment #Security #Database Design #Azure #Azure SQL Database #Cloud #Scala #Data Architecture #Version Control #Data Governance #Tableau #Business Analysis #Informatica #Physical Data Model #Visualization #Leadership #ADF (Azure Data Factory) #NoSQL #Data Modeling
Role description
Job Title: Senior Database Modeler
Location: Onsite (Issaquah, WA)
Experience Required: 10+ Years
Rate: $60/hr on C2C

Role Overview:
We are seeking an experienced Database Modeler to lead the data architecture and modeling efforts for a major replatforming project, migrating legacy IBM iSeries (AS/400) systems to modern cloud-based applications on Google Cloud Platform (GCP) and/or Microsoft Azure. The ideal candidate will have hands-on experience in data modeling, source-to-target mapping, and cloud-native database design. Strong collaboration skills and the ability to work with distributed teams are essential.

Key Responsibilities:
• Design and develop logical and physical data models for cloud databases (Azure SQL Database, Cloud SQL, BigQuery, Spanner).
• Analyze iSeries (AS/400) data structures (physical/logical files, SQL tables, indexes, views) and create detailed mapping documents for migration.
• Define and enforce data standards, normalization rules, and naming conventions aligned with cloud best practices.
• Work closely with Business Analysts to interpret business rules and translate them into data models.
• Collaborate with cloud architects to ensure models align with GCP/Azure architecture, including networking, security, and scalability.
• Design schemas for high performance, leveraging cloud-native features (partitioning, indexes, materialized views).
• Work with ETL teams to support data transformation and migration using tools like Azure Data Factory, Google Dataflow, or Informatica.
• Maintain data dictionaries, metadata repositories, and version control for models.
• Document processes, data flows, and architectural decisions.
• Collaborate with onsite and offshore teams to ensure seamless delivery across time zones.

Required Skills & Qualifications:
• 6+ years of experience in data modeling, database design, and migration projects.
• Strong hands-on experience with relational databases (Azure SQL, Cloud SQL, BigQuery).
• Proficiency with data modeling tools (Erwin, ER/Studio, SQL Database Modeler).
• Hands-on experience with cloud migration strategies, data security, and cost optimization.
• Experience with ETL tools (Azure Data Factory, Google Dataflow, Informatica).
• Solid understanding of data governance, compliance, and cloud security principles.
• Excellent communication and leadership skills.
• Experience working in Agile/Scrum environments.

Preferred Tools & Technologies:
• Cloud Platforms: GCP, Azure
• Databases: Azure SQL, Cloud SQL, BigQuery, NoSQL (Cosmos DB, Firestore)
• Modeling Tools: Erwin, ER/Studio
• ETL: Azure Data Factory, Google Dataflow, Informatica
• Version Control: Git, Bitbucket
• Collaboration: Jira, Confluence, Slack, Microsoft Teams

Nice to Have:
• Certifications in cloud platforms or data modeling.
• Experience with data visualization tools (Power BI, Tableau).
• Exposure to CI/CD pipelines for database deployments.
• Experience with retail or enterprise-scale applications.