

Insight Global
Sr. GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. GCP Data Engineer with a contract length of "unknown," offering a day rate of $680 USD. Key skills include GCP, Big Data, Python, SQL, and DBT. Requires 10+ years in a Data Architect role and strong data modeling experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
January 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Storage #IAM (Identity and Access Management) #Visualization #Cloud #Data Pipeline #Data Mining #dbt (data build tool) #CRM (Customer Relationship Management) #ETL (Extract, Transform, Load) #Big Data #Data Modeling #Python #Google Cloud Storage #AI (Artificial Intelligence) #ML (Machine Learning) #SQL (Structured Query Language) #BigQuery #Data Engineering #GCP (Google Cloud Platform) #Data Architecture
Role description
Job Description:
Our leading entertainment client is expanding their data team and seeking a Senior Data Architect with deep expertise in GCP, Big Data, Python, SQL, DBT, and ERD architecture. This cross-functional team operates across three core areas: AI/ML, Data Engineering, and Data Analytics.
• Collaborate with stakeholders across subscription, viewership, marketing, paid media, CRM, and more.
• Support data visualization initiatives and ensure data is accessible and actionable.
• Work in a highly collaborative environment that values fast learners and team players.
• Spend approximately 80% of your time in hands-on development using Python and SQL.
Required Skills & Experience
• 10+ years in a Data Architect role, with a strong focus on data modeling in Big Data environments (100TB+).
• Advanced proficiency in Python and SQL for data mining, pipeline development, segmentation, and orchestration.
• Deep experience with GCP, including BigQuery, Google Cloud Storage (GCS) buckets, IAM, Cloud Functions, Composer, and Vertex AI (a minimal sketch of this stack follows the list).
• Proven ability to analyze and troubleshoot complex data pipelines through root-cause analysis.
• Hands-on experience using DBT for data modeling and transformation workflows.
• Strong understanding and application of ERD architecture in enterprise data systems.
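To give a flavor of the hands-on Python/SQL work described above, here is a minimal sketch of a BigQuery query step of the kind such a pipeline might include. It is illustrative only and not part of the posting; the project, dataset, and table names are hypothetical, and it assumes the google-cloud-bigquery package is installed with application-default credentials configured.
```python
# Minimal sketch: query daily viewership counts from BigQuery in Python.
# Project, dataset, and table names below are hypothetical placeholders.
from google.cloud import bigquery


def daily_viewership(project_id: str = "my-project") -> list[dict]:
    """Return per-title viewership counts for the previous day."""
    client = bigquery.Client(project=project_id)
    sql = """
        SELECT title_id, COUNT(*) AS views
        FROM `my-project.analytics.viewership_events`
        WHERE DATE(event_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
        GROUP BY title_id
        ORDER BY views DESC
    """
    rows = client.query(sql).result()  # run the job and wait for completion
    return [dict(row) for row in rows]


if __name__ == "__main__":
    for row in daily_viewership():
        print(row)
```
In practice, a step like this would typically be wrapped in a Composer (Airflow) task or expressed as a DBT model rather than run as a standalone script.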






