

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This is a long-term Data Engineer contract (3 to 6 months), fully remote within the US (Pacific Time hours). Key skills include GCP, BigQuery, dbt Core, data modeling, and ETL pipelines.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 17, 2025
Project duration
3 to 6 months
Location type
Remote
Contract type
Unknown
Security clearance
Unknown
Location detailed
United States
Skills detailed
#IP (Internet Protocol) #dbt (data build tool) #GA4 (Google Analytics 4) #GCP (Google Cloud Platform) #Data Modeling #BI (Business Intelligence) #Datasets #Data Engineering #IAM (Identity and Access Management) #Logging #Cloud #Looker #Storage #Data Layers #ETL (Extract, Transform, Load)
Role description
Title: Data Engineer
Location: Remote – Anywhere in the US (must work Pacific Time hours)
Contract: Long Term Contract
Project Scope (Phases 0–1):
Phase 0 – Foundations (Months 0–1):
• GCP project setup, IAM, and audit logging.
• Object storage (GCS) buckets with Raw → Clean → Gold layers + retention rules (a sketch of this layout follows the list).
• BigQuery datasets aligned with the data layers.
• dbt Core scaffolding (open source, no vendor lock-in).
• Static IP setup for SFTP partner access.
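A minimal sketch of how the Phase 0 storage layout could be provisioned, assuming the google-cloud-storage and google-cloud-bigquery client libraries. The project ID, bucket names, location, and retention windows are illustrative placeholders, not details from the role.

from google.cloud import bigquery, storage

PROJECT_ID = "analytics-prod"  # hypothetical project ID
LAYERS = ["raw", "clean", "gold"]  # Raw -> Clean -> Gold layers
RETENTION_DAYS = {"raw": 90, "clean": 365, "gold": None}  # example retention rules

storage_client = storage.Client(project=PROJECT_ID)
bq_client = bigquery.Client(project=PROJECT_ID)

for layer in LAYERS:
    # One GCS bucket per layer, e.g. analytics-prod-raw.
    bucket = storage_client.bucket(f"{PROJECT_ID}-{layer}")
    days = RETENTION_DAYS[layer]
    if days:
        # Lifecycle rule: delete objects older than the layer's retention window.
        bucket.add_lifecycle_delete_rule(age=days)
    storage_client.create_bucket(bucket, location="US")

    # One BigQuery dataset per layer, aligned with the bucket.
    dataset = bigquery.Dataset(f"{PROJECT_ID}.{layer}")
    dataset.location = "US"
    bq_client.create_dataset(dataset, exists_ok=True)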
Phase 1 – Early Value (Months 1–2):
• GA4/Firebase native export into BigQuery.
• Partner SFTP ingestion → GCS → BigQuery (a sketch of this path follows the list).
• dbt models for KPIs (active users, deposits, offers, conversions).
• Insight dashboards with Looker Studio (preferred, not mandatory).
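A minimal sketch of the partner ingestion path (SFTP → GCS → BigQuery), assuming paramiko for SFTP alongside the GCP client libraries. The host, credentials, file paths, and table name are hypothetical; in practice the Phase 0 static IP would be allow-listed by the partner and key-based auth used instead of a password.

import paramiko
from google.cloud import bigquery, storage

SFTP_HOST = "sftp.partner.example.com"  # hypothetical partner host
REMOTE_PATH = "/outbound/deposits.csv"  # hypothetical partner export
BUCKET = "analytics-prod-raw"  # raw-layer bucket from Phase 0
TABLE_ID = "analytics-prod.raw.partner_deposits"  # hypothetical target table

# 1. Pull the partner file over SFTP.
transport = paramiko.Transport((SFTP_HOST, 22))
transport.connect(username="svc-ingest", password="...")  # placeholder credentials
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get(REMOTE_PATH, "/tmp/deposits.csv")
sftp.close()
transport.close()

# 2. Land the file in the raw GCS bucket.
blob = storage.Client().bucket(BUCKET).blob("partner/deposits.csv")
blob.upload_from_filename("/tmp/deposits.csv")

# 3. Load from GCS into BigQuery; dbt models then build the KPI layer on top.
bq = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
load_job = bq.load_table_from_uri(
    f"gs://{BUCKET}/partner/deposits.csv", TABLE_ID, job_config=job_config
)
load_job.result()  # block until the load job finishes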
Key Skills:
• Strong experience with Google Cloud Platform (GCP), BigQuery, and dbt Core.
• Hands-on expertise in data modeling, ETL pipelines, and orchestration.
• Familiarity with SFTP integrations and secure data transfer.
• Knowledge of Looker Studio or other BI tools is a plus.