Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position on a long-term contract (3 to 6 months), working remotely in the US on PT hours. Key skills include GCP, BigQuery, dbt Core, data modeling, and ETL pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 17, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#IP (Internet Protocol) #dbt (data build tool) #GA4 (Google Analytics 4) #GCP (Google Cloud Platform) #Data Modeling #BI (Business Intelligence) #Datasets #Data Engineering #IAM (Identity and Access Management) #Logging #Cloud #Looker #Storage #Data Layers #ETL (Extract, Transform, Load)
Role description
Title: Data Engineer
Location: Remote - anywhere in the US (must work PT hours)
Contract: Long-term contract

Project Scope (Phases 0-1):

Phase 0 - Foundations (Months 0-1):
• GCP project setup, IAM, and audit logging.
• Object storage (GCS) buckets with Raw → Clean → Gold layers plus retention rules.
• BigQuery datasets aligned with the data layers.
• dbt Core scaffolding (open source, no vendor lock-in).
• Static IP setup for SFTP partner access.

Phase 1 - Early Value (Months 1-2):
• GA4/Firebase native export into BigQuery.
• Partner SFTP ingestion → GCS → BigQuery (see the ingestion sketch below).
• dbt models for KPIs (active users, deposits, offers, conversions).
• Insight dashboards in Looker Studio (preferred, not mandatory).

Key Skills:
• Strong experience with Google Cloud Platform (GCP), BigQuery, and dbt Core.
• Hands-on expertise in data modeling, ETL pipelines, and orchestration.
• Familiarity with SFTP integrations and secure data transfer.
• Knowledge of Looker Studio or other BI tools is a plus.
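For context on the Phase 1 ingestion path, here is a minimal sketch of loading a partner file (already delivered over SFTP and staged in a Raw-layer GCS bucket) into a BigQuery raw dataset using the Python client. The project ID, bucket, and table names are placeholders; in this project, dbt Core models would then build the Clean/Gold KPI tables on top of the raw table.

```python
from google.cloud import bigquery

# Hypothetical project ID; replace with the real GCP project from Phase 0.
client = bigquery.Client(project="my-gcp-project")

# Append partner CSV data into a raw-layer table, letting BigQuery infer the schema.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Hypothetical Raw-layer object and raw dataset/table names.
uri = "gs://partner-raw-bucket/deposits/2025-09-17.csv"
table_id = "my-gcp-project.raw.partner_deposits"

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # wait for the load job to finish

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```

In practice, a load like this would be scheduled by whatever orchestration layer the team chooses and followed by dbt runs that materialize the KPI models.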