RSA Tech

Data Architect - GCP - (W2 Contract & Remote)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Data Platform Architect (Contract) focused on GCP, paying $80-$90/hour, remote for USC/GC applicants. Requires 8+ years IT experience, strong GCP skills, and expertise in data modeling, security, and BI. Preferred retail domain experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date
February 18, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Remote
-
🧠 - Skills detailed
#Leadership #EDW (Enterprise Data Warehouse) #Logging #Migration #Monitoring #Kanban #Metadata #Scala #Data Modeling #ETL (Extract, Transform, Load) #IAM (Identity and Access Management) #BigQuery #BI (Business Intelligence) #Data Catalog #Data Architecture #Cloud #Jira #Automation #Data Quality #Agile #Security #Documentation #GCP (Google Cloud Platform) #AI (Artificial Intelligence)
Role description
Note: We are currently unable to sponsor; we encourage only USC/GC candidates to apply.

Senior Data Platform Architect (Contract) – GCP

Job Summary:
We’re looking for a senior, hands-on contract architect to accelerate our Enterprise Data Platform (EDP) on Google Cloud Platform (GCP). This role will partner closely with the Architecture and engineering teams to drive architecture and delivery across the platform, with a primary focus on our Enterprise Consumption Layer: the standardized analytics serving layer that delivers consumption-ready data products for BI and emerging AI/data-agent use cases. You’ll operate at the intersection of architecture, prototyping, and pragmatic delivery, bringing a strong point of view, validating patterns through POCs, and helping teams implement solutions that scale.

Key Responsibilities:
· Lead rapid POCs and technical exploration to validate EDP and Enterprise Consumption Layer patterns (data modeling approach, performance/cost, security, operability).
· Define and evolve architecture patterns and reference implementations for:
  o Building and maintaining the Enterprise Consumption Layer (enterprise-ready star schemas / dimensional models)
  o Data product structure, onboarding standards, and lifecycle (naming, documentation, quality expectations)
  o BI and downstream access patterns (including guardrails for scalable consumption)
· Build hands-on prototypes on GCP when needed (you can implement, not just design).
· Partner with architects, engineers, and product owners to design solutions that balance:
  o Speed vs. durability
  o Innovation vs. standardization
  o Incremental delivery vs. long-term platform fit
· Contribute to practical enablement of data/AI agent use cases where valuable (e.g., data discovery, metadata-driven workflows, quality checks, operational insights).
· Provide technical leadership through design reviews, tradeoff decisions, documentation, and stakeholder communication.
· Operate in a Kanban/Agile model using Jira, delivering measurable outcomes iteratively.

Required Qualifications:
· 8+ years of overall IT experience with significant hands-on delivery in modern data platforms.
· Strong experience building on Google Cloud Platform, including several of:
  o BigQuery (modeling, optimization, cost/performance patterns)
  o Dataform (or an equivalent transformation framework)
  o GCS, Cloud Composer, Cloud Functions/Cloud Run, IAM
  o Logging/monitoring patterns (Cloud Logging/Monitoring)
· Demonstrated ability to work across architecture and implementation: define a pattern, prototype it, and guide adoption.
· Strong foundation in:
  o Dimensional modeling / star schemas, metric definitions, and consumption-ready data structures
  o A data quality mindset (knowing what “enterprise-ready” looks like)
  o Security and access design (least privilege, domain/dataset-based access concepts)
· Strong communication skills, with the ability to explain tradeoffs to both technical and non-technical stakeholders.

Preferred Qualifications:
· Experience in retail / consumer / omni-channel domains (customer, product, inventory, orders, pricing, promotions, loyalty).
· Familiarity with semantic layer concepts (even if not implementing one).
· Experience with data catalog / metadata platforms and governance workflows.
· Exposure to LLM-enabled/agentic tooling and automation patterns.
· Prior work in platform modernization or EDW migration programs.
· Experience defining enterprise standards (naming, dataset organization, data product onboarding, testing patterns).

Team & Culture Fit:
· Collaborative, outcome-driven, and comfortable with ambiguity and early-stage exploration.
· Bias toward action: prototype fast, document clearly, scale what works.
· Strong ownership: identifies gaps, proposes options, drives alignment and closure.

Please send your resume to uday@rsatechgroup.com

Pay: $80.00 - $90.00 per hour
Work Location: Remote