

Analytical Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an Analytical Engineer on a 6-month contract offering competitive pay. It requires 10+ years in data architecture, advanced Python and SQL skills, and deep GCP expertise; experience with dbt and ERD architecture is essential.
Country
United States
Currency
$ USD
Day rate
600
Date discovered
August 7, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
California, United States
Skills detailed
#GCP (Google Cloud Platform) #SQL (Structured Query Language) #Data Modeling #BigQuery #Cloud #Storage #IAM (Identity and Access Management) #ML (Machine Learning) #Google Cloud Storage #dbt (data build tool) #Data Mining #Python #Data Architecture #Visualization #Data Pipeline #AI (Artificial Intelligence) #Data Engineering #ETL (Extract, Transform, Load) #Scala #Big Data #CRM (Customer Relationship Management)
Role description
Must-Have Qualifications:
• 10+ years in a Data Architect/Engineering role, with a strong focus on data modeling in Big Data environments (100TB+).
• Advanced proficiency in Python and SQL for data mining, pipeline development, segmentation, and orchestration.
• Deep experience with GCP, including BigQuery, Google Cloud Storage (GCS) buckets, IAM, Cloud Functions, Composer, and Vertex AI.
• Proven ability to analyze and troubleshoot complex data pipelines through root-cause analysis.
• Hands-on experience using dbt for data modeling and transformation workflows (see the sketch after this list).
• Strong understanding and application of ERD architecture in enterprise data systems.
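As a rough illustration of the dbt/BigQuery modeling work named above, the sketch below submits a dbt-style transformation (a SELECT materialized into a summary table) to BigQuery from Python. The project, dataset, and table names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: run a dbt-style transformation query against BigQuery from Python.
# Assumes Application Default Credentials; all names below are hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client(project="example-entertainment-dw")  # hypothetical project ID

# A dbt model is, at its core, a SELECT that materializes into a table or view.
# Here the same idea rebuilds a 30-day viewership summary (names illustrative only).
sql = """
CREATE OR REPLACE TABLE `example-entertainment-dw.analytics.daily_viewership` AS
SELECT
  DATE(event_ts)          AS view_date,
  subscription_tier,
  COUNT(DISTINCT user_id) AS unique_viewers,
  SUM(watch_minutes)      AS total_watch_minutes
FROM `example-entertainment-dw.raw.viewership_events`
WHERE DATE(event_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY view_date, subscription_tier
"""

job = client.query(sql)  # submit the query job to BigQuery
job.result()             # block until the transformation finishes
print(f"Rebuilt daily_viewership; job {job.job_id} ended in state {job.state}")
```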
Pluses:
• AI Agent, ADK, Multi-Agent, Gen AI
Day-to-Day Responsibilities:
Our leading entertainment client is expanding their data team and seeking a Senior Data Engineer with deep expertise in GCP, Big Data, Python, SQL, DBT, and ERD architecture. This cross-functional team operates across three core areas: AI/ML, Data Engineering, and Data Analytics.
• Collaborate with stakeholders across subscription, viewership, marketing, paid media, CRM, and more.
• Support data visualization initiatives and ensure data is accessible and actionable.
• Work in a highly collaborative environment that values fast learners and team players.
• Spend approximately 80% of your time in hands-on development using Python and SQL.
Team Dynamic:
Data Analytics Engineering (Approx. 80% of time)
• Primary focus: Analyze and maintain data pipelines and conduct root-cause analysis to resolve data-related issues (a sketch of such a check follows this section).
• Tools & skills:
  • Proficient in Python and SQL for pipeline development.
  • Initially focused on learning and maintaining existing pipelines, with a gradual transition to building new ones.
• Mindset: Business-oriented with a strong analytical approach.
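As an illustration of the root-cause-analysis work described above, the sketch below shows a simple pipeline health check in Python: verify that yesterday's data landed in a BigQuery summary table before tracing a failure upstream. All project, dataset, and table names are hypothetical placeholders.

```python
# Minimal sketch of a pipeline health check used during root-cause analysis:
# confirm yesterday's partition landed and the row count is non-zero.
# All project/dataset/table names are hypothetical placeholders.
import datetime
from google.cloud import bigquery

client = bigquery.Client(project="example-entertainment-dw")  # hypothetical

target_date = datetime.date.today() - datetime.timedelta(days=1)
sql = """
SELECT COUNT(*) AS row_count
FROM `example-entertainment-dw.analytics.daily_viewership`
WHERE view_date = @target_date
"""
job = client.query(
    sql,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("target_date", "DATE", target_date)
        ]
    ),
)
row_count = next(iter(job.result())).row_count

# A missing or empty partition is the usual starting point for tracing the
# failure back upstream (source feed, load job, transformation step).
if row_count == 0:
    raise RuntimeError(f"No rows for {target_date}: upstream load likely failed")
print(f"{row_count} rows for {target_date}: partition looks healthy")
```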
AI/ML Initiatives (Approx. 5% of time)
• Objective: Support AI and machine learning efforts using modern platforms such as Vertex AI.
• Scope: Limited involvement, primarily assisting with existing initiatives and tools.
Data Engineering (Approx. 15% of time)
• Responsibilities: Design and implement scalable data models, pipelines, and architecture (an orchestration sketch follows below).
• Technical Stack: Heavy use of Python and SQL to ensure robust and efficient data infrastructure.
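Since the role names Composer alongside Python and SQL, the sketch below shows one way such a pipeline might be orchestrated on Cloud Composer (managed Apache Airflow 2.x): a daily DAG with a single BigQuery job. The DAG id, schedule, and SQL target are hypothetical assumptions, not details taken from this posting.

```python
# Minimal sketch of orchestrating the daily rebuild on Cloud Composer (Airflow 2.x).
# DAG id, schedule, and all BigQuery names are hypothetical placeholders.
import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_viewership_refresh",      # hypothetical DAG name
    start_date=datetime.datetime(2025, 8, 1),
    schedule="0 6 * * *",                   # run every day at 06:00 UTC
    catchup=False,
) as dag:
    rebuild_summary = BigQueryInsertJobOperator(
        task_id="rebuild_daily_viewership",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE "
                    "`example-entertainment-dw.analytics.daily_viewership` AS "
                    "SELECT DATE(event_ts) AS view_date, COUNT(*) AS events "
                    "FROM `example-entertainment-dw.raw.viewership_events` "
                    "GROUP BY view_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```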