

Jobs via Dice
Data Engineer (Google Cloud Platform)_ W2
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Google Cloud Platform) on a 6-month contract, remote with PST/EST overlap. Requires 7+ years of experience, strong skills in PySpark, Databricks, Python, and GCP tools, with a focus on data ingestion and modeling.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 27, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#dbt (data build tool) #JSON (JavaScript Object Notation) #Consulting #PySpark #Storage #Data Modeling #Databricks #Spark (Apache Spark) #GCP (Google Cloud Platform) #XML (eXtensible Markup Language) #Data Engineering #Datasets #Python #Airflow #Data Ingestion #Cloud #BigQuery
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Learn Beyond Consulting LLC, is seeking the following. Apply via Dice today!
Data Engineer (Google Cloud Platform) (Contract) Multiple Openings
Type: Contract (6 months, extension likely; long-term potential)
Location: Remote (PST/EST overlap required)
Technical Requirements
Must-Have
• PySpark & Databricks: Strong hands-on experience building and maintaining production pipelines. Experience with Unity Catalog is a plus.
• Python Engineering: Primary development language with production-grade practices (typed, tested, modular code, not notebook-only development).
• Google Cloud Platform Ecosystem: Proven experience with BigQuery, Dataproc, Airflow/Cloud Composer, Pub/Sub, and Cloud Storage (Parquet).
• Data Ingestion: Experience working with complex, multi-source datasets across varying schemas and formats (e.g., JSON, CSV, XML, custom feeds).
• Data Modeling: Strong understanding of staging and curated layer design, partitioning strategies, and schema evolution across distributed data sources.
• Experience Level: 7+ years of hands-on data engineering experience with a track record of building and maintaining production systems (not research-focused profiles).
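To illustrate the Data Ingestion requirement above, here is a minimal, hypothetical sketch (not from the posting) of normalizing two feeds with differing schemas, one JSON and one CSV, into a single staging-layer shape using only the Python standard library. All field names here are invented for illustration.

```python
# Hypothetical sketch: map multi-source records (JSON and CSV feeds with
# differing field names) onto one canonical staging schema.
import csv
import io
import json

CANONICAL = ("track_id", "title", "artist")

def from_json(blob: str) -> dict:
    """The JSON feed uses its own field names; remap them to the canonical schema."""
    rec = json.loads(blob)
    return {"track_id": rec["id"], "title": rec["name"], "artist": rec["performer"]}

def from_csv(blob: str) -> dict:
    """The CSV feed already uses canonical headers; take its first data row."""
    row = next(csv.DictReader(io.StringIO(blob)))
    return {k: row[k] for k in CANONICAL}

json_feed = '{"id": "t1", "name": "So What", "performer": "Miles Davis"}'
csv_feed = "track_id,title,artist\nt2,Blue in Green,Miles Davis\n"
staged = [from_json(json_feed), from_csv(csv_feed)]
print(staged)
```

In a production pipeline this per-source mapping step is what feeds the staging layer before curated-layer modeling; at scale the same pattern would live in PySpark readers rather than stdlib parsers.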
Nice to Have
• dbt: Experience working with dbt, ideally within a medallion/lakehouse architecture.
• Entity Resolution: Familiarity with record linkage techniques (fuzzy matching, phonetic similarity, Python-based frameworks).
• Domain Experience: Exposure to music, royalties, or media rights data is a plus.
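The Entity Resolution bullet above can be sketched with a minimal, hypothetical example (not from the posting) of fuzzy record linkage using the standard library's difflib; real deployments would typically use a dedicated framework, but the thresholded-similarity idea is the same. The names and threshold below are invented for illustration.

```python
# Hypothetical sketch: link artist names across two feeds by fuzzy
# string similarity, keeping only matches above a confidence threshold.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two strings (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link_records(left: list, right: list, threshold: float = 0.85) -> list:
    """Pair each left-hand name with its best right-hand match above threshold."""
    links = []
    for name in left:
        best = max(right, key=lambda cand: similarity(name, cand))
        score = similarity(name, best)
        if score >= threshold:
            links.append((name, best, round(score, 2)))
    return links

feed_a = ["The Beatles", "Miles Davis"]
feed_b = ["Beatles, The", "Miles  Davis", "Nina Simone"]
print(link_records(feed_a, feed_b))
```

Note that "The Beatles" vs. "Beatles, The" falls below the 0.85 cutoff here, which is exactly why the posting mentions phonetic similarity and purpose-built record-linkage frameworks rather than raw edit-distance ratios.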





