MokshaaLLC

Looker Architect - San Ramon (5 Days ONSITE)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Looker Architect in San Ramon, CA, offering $75/hr+ for a contract position. Requires 10-14 years of analytics/BI experience, 4-5 years with Looker and BigQuery, and strong SQL skills. Looker and Google Cloud certifications preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
πŸ—“οΈ - Date
November 8, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
San Ramon, CA
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Strategy #Observability #Data Architecture #Computer Science #BigQuery #API (Application Programming Interface) #dbt (data build tool) #Scala #GCP (Google Cloud Platform) #AI (Artificial Intelligence) #Data Access #Semantic Models #Automation #ML (Machine Learning) #Cloud #Business Analysis #Snowflake #Monitoring #Security #Data Mart #BI (Business Intelligence) #Migration #Looker #Data Pipeline #Data Engineering #Dataflow
Role description
Job Title: Looker + Google BigQuery Architect (On-Site, San Ramon, CA)
Location: San Ramon, CA – 5 days a week onsite
Client Domain: Cloud Computing / IT Services / Enterprise Solutions
Contracting: Open to all engagement types (W2/1099/C2C)
Pay Range: $75/hr and up (negotiable based on experience)

Overview
We are seeking a seasoned Looker + Google Cloud data architect to lead and shape our enterprise analytics platform. You will design end-to-end architecture, build scalable data models and dashboards, work closely with business stakeholders, data engineers, and BI developers, and ensure high performance, security, governance, and cost efficiency. If you are passionate about turning raw data into business insight, thrive in collaborative environments, and want to drive analytics at scale, we'd love to talk to you.

Key Responsibilities
• Architect and implement the full Looker + BigQuery stack: semantic layer, LookML models, and reusable data components aligned to enterprise strategy.
• Lead development of dashboards and data models for key business areas (finance, sales, product).
• Partner with data engineering and pipeline teams to ensure data freshness, lineage, and a sound pipeline architecture feeding Looker.
• Implement row-level and column-level security, optimise BigQuery queries for performance and cost, and establish monitoring/observability for the BI environment.
• Work closely with product owners and business analysts to translate business KPIs into data models and dashboards.
• Mentor and guide the Looker team and ensure alignment with data architecture standards.
• Innovate: evaluate and integrate advanced analytics/AI/ML within the Looker/BigQuery ecosystem, drive automation of reporting workflows, and adopt modern Looker features (Blocks, API, embedded analytics).

Required Skills and Experience
• 10–14 years of professional experience in analytics/BI, including a minimum of 4–5 years of hands-on work with Looker and GCP (BigQuery/data pipeline stack).
• Strong proficiency in Looker (LookML modelling, Looker API), SQL optimisation, and cost-efficient query structuring.
• Deep experience with Google BigQuery and related GCP services (Cloud Composer, Dataflow, Pub/Sub, Dataform or dbt).
• Proven track record designing scalable semantic models/data marts and leading BI architecture for enterprise use cases.
• Experience implementing BI governance, data access/security (row/column level), monitoring, and SLAs in a production environment.
• Excellent communication, stakeholder management, mentoring, and problem-solving skills.
• Bachelor's degree in Computer Science, Data Analytics, or a similar field (preferred).

Preferred Qualifications
• Looker certification (LookML Developer or Looker Business Analyst).
• Google Cloud Professional Data Engineer or Architect certification.
• Experience in a Managed Service / AMS environment or data product operations.
• Exposure to Snowflake / dbt → BigQuery migration is a plus.
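To illustrate the kind of row-level security work the responsibilities describe, here is a minimal LookML sketch using Looker's `access_filter` (the explore, field, and user-attribute names are hypothetical, not from the posting):

```lookml
# Hypothetical explore: restrict each user to rows whose region value
# matches the "region" user attribute assigned in Looker's admin panel.
explore: orders {
  access_filter: {
    field: orders.region
    user_attribute: region
  }
}
```

An architect would typically pair filters like this with column-level controls (e.g. field access grants) and verify that the generated BigQuery SQL still prunes partitions efficiently.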