

GCP Data Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Architect, a long-term, 100% remote position requiring media industry experience and Google Cloud Architect Certification. Candidates must have expertise in SQL, Python, GCP services, and 10+ years in data modeling with recent big data experience.
Country: United States
Currency: $ USD
Day rate: $636.36
Date discovered: June 4, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
π§ - Skills detailed
#Python #Data Architecture #BigQuery #Data Mining #Cloud #GCP (Google Cloud Platform) #dbt (data build tool) #Data Pipeline #Big Data #Security #Data Modeling #SQL (Structured Query Language) #Visualization
Role description
ONLY W2 CANDIDATES ARE ACCEPTED FOR THIS OPPORTUNITY.
Job Title: GCP Data Architect
Duration: Long-term
Location: 100% Remote
Media industry experience required
Google Cloud Architect certification required
Top Skills Details:
• Expertise in SQL, Python, ERD, GCP (all services, especially BigQuery, GCS, Cloud Functions, Composer), and dbt, with active hands-on experience within the last 3 to 5 years (a brief BigQuery sketch follows)
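As a rough illustration of the SQL-plus-Python side of this stack, here is a minimal sketch using the google-cloud-bigquery client library. The project ID, dataset, table, and column names are hypothetical placeholders invented for the example, not names from this posting.

from google.cloud import bigquery

# Assumes Application Default Credentials are configured
# (e.g. via `gcloud auth application-default login`).
# The project ID below is a hypothetical placeholder.
client = bigquery.Client(project="my-media-project")

# Parameterized query against a hypothetical viewership table.
query = """
    SELECT content_id, COUNT(*) AS views
    FROM `my-media-project.analytics.viewership_events`
    WHERE event_date >= @start_date
    GROUP BY content_id
    ORDER BY views DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2025-01-01"),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.content_id, row.views)

Parameterized queries like this avoid string interpolation into SQL and are the usual pattern when pipeline code passes schedule- or user-derived values into BigQuery.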
Key Responsibilities and Skills:
• Possess knowledge of modern data technologies released in the last 2 to 3 years
β’ Design and optimize conceptual and logical database models
• Analyze system requirements, implement data strategies, and ensure efficiency and security
β’ Improve system performance by conducting tests, troubleshooting, and integrating new elements
β’ In-depth understanding of database structure principles
• Expertise in implementing and maintaining data pipelines (see the orchestration sketch after this list)
β’ Deep knowledge of data mining and segmentation techniques
• 10+ years in data modeling with 3+ years in big data (100 TB+)
β’ Familiarity with data visualization tools
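To make the pipeline and orchestration requirements concrete, here is a minimal sketch of a daily Cloud Composer (Airflow 2.4+) DAG that lands files from GCS into a BigQuery staging table. The DAG ID, bucket, dataset, and table names are invented for the example and would depend on the actual environment.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

# Hypothetical daily load: GCS landing zone -> BigQuery staging table.
with DAG(
    dag_id="daily_viewership_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="media-landing-zone",                # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],  # partitioned by run date
        destination_project_dataset_table="analytics.viewership_events_staging",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

In a fuller setup, a downstream task (for example, a dbt run) would transform the staging table into modeled marts, which is where the posting's dbt and data modeling requirements would come into play.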