Aptimized

Google Cloud Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Google Cloud Architect with 10+ years of experience, including 3+ years in Big Data. It is a remote position in the USA with a minimum 6-month contract and an open bill rate. Key skills: SQL, Python, ERD, GCP, and dbt. US Citizen or Green Card holder required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Open
🗓️ - Date
April 28, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Warehouse #Cloud #dbt (data build tool) #ETL (Extract, Transform, Load) #Visualization #Data Modeling #GCP (Google Cloud Platform) #Data Ingestion #Security #Data Mining #Python #Looker #Big Data #SQL (Structured Query Language) #Storage #Data Pipeline #Data Architecture
Role description
Key skills: SQL, Python, ERD, GCP, and dbt.

Position Name: Big Data Architect
Location: Remote, USA (the client is in the Pacific time zone; being close to that zone is nice to have, but not mandatory)
Experience required: 10+ years, with 3+ years in Big Data (must)
Bill Rate: Open
Duration: Minimum 6 months, with possible extension based on performance
Sponsorship: Candidates must not require visa sponsorship; US Citizen or Green Card holder required
Client Name: TV Station

Job Description:
• Design and optimize conceptual and logical database models.
• Analyze system requirements, implement data strategies, and ensure efficiency and security.
• Expertise in SQL, Python, ERD, GCP, and dbt is essential.
• In-depth understanding of database structure principles.
• Deep knowledge of data mining and segmentation techniques.
• Familiarity with data visualization tools (Looker).
• Build and maintain big data ETL pipelines using dbt, GCP services, Python, and SQL (see the sketch after this list).
• Build, manage, and maintain data ingestion processes landing data in Google Cloud Storage and BigQuery.
• Implement and maintain data pipelines.
• Enhance and maintain the analytics data warehouse on BigQuery.
• Improve system performance by conducting tests, troubleshooting, and integrating new elements.

Desirable: 10+ years in data modeling, with 3+ years in Big Data (100TB+).
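As context for the ingestion responsibilities above, here is a minimal sketch of a BigQuery load step using the google-cloud-bigquery Python client. The project ID, bucket URI, dataset, and table names are hypothetical placeholders, not details from this posting.

```python
# Illustrative only: land newline-delimited JSON from Cloud Storage into BigQuery.
# All project, bucket, and table names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,  # append to the landing table
    autodetect=True,  # infer the schema from the incoming files
)

# Start a load job from a GCS wildcard URI into a raw landing table,
# then block until BigQuery reports completion (raises on failure).
load_job = client.load_table_from_uri(
    "gs://example-bucket/events/*.json",  # hypothetical source files
    "example_dataset.raw_events",         # hypothetical destination table
    job_config=job_config,
)
load_job.result()

table = client.get_table("example_dataset.raw_events")
print(f"Landing table now has {table.num_rows} rows.")
```

In a setup like the one this role describes, dbt models would then typically transform such raw landing tables into staging and warehouse tables on BigQuery.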