
Google Cloud Platform Data Engineer - W2 - Remote - Contract

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Google Cloud Platform Data Engineer on a long-term, remote W2 contract, requiring 3+ years of hands-on experience with GCP services, Python, SQL, Apache Kafka, and NoSQL databases. U.S. citizenship or a green card is mandatory.
🌎 - Country
United States
-
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 22, 2025
-
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Airflow #Scripting #Data Processing #Linux #Schema Design #Apache Airflow #Scala #Programming #Data Engineering #Kafka (Apache Kafka) #Automation #Data Pipeline #SQL (Structured Query Language) #Apache Kafka #API (Application Programming Interface) #Shell Scripting #NoSQL #Data Quality #MongoDB #SQL Queries #Python #Storage #Cloud #GCP (Google Cloud Platform) #Data Profiling #Databases #BigQuery
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, TechVirtue LLC, is seeking the following. Apply via Dice today!

Job Title: Google Cloud Platform Data Engineer
Location: Remote
Duration: Long term
Work Authorization: U.S. Citizen or Green Card holder

Required: a minimum of 3 years of proven hands-on experience in the following:
• Design and implement robust data pipelines using Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Pub/Sub.
• Develop and manage workflows using Cloud Composer (Apache Airflow) for efficient scheduling and orchestration (see the first sketch after this list).
• Write clean, efficient, and scalable code in Python, leveraging advanced programming techniques.
• Craft complex SQL queries in BigQuery, including window functions, CTEs, and performance-tuning strategies (see the second sketch after this list).
• Build and maintain real-time data processing systems using Apache Kafka.
• Model and manage NoSQL databases, particularly MongoDB, with a focus on scalable schema design.
• Use shell scripting and perform Linux system administration tasks to support data infrastructure.
• Conduct data profiling and implement validation techniques to ensure data quality and integrity.
• Develop and maintain API integration scripts for seamless service automation and data exchange.
• Troubleshoot and resolve data-related issues with strong analytical and problem-solving skills.
• Create and maintain data flow diagrams that clearly communicate architecture and pipeline logic to stakeholders.
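For candidates weighing the Cloud Composer line item, here is a minimal sketch of the kind of Airflow 2.x DAG the role describes; the DAG id, schedule, and task logic are hypothetical illustrations, not part of the posting.

```python
# A minimal sketch, assuming Airflow 2.x on Cloud Composer. The DAG id,
# schedule, and task bodies are hypothetical, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from an upstream source (e.g. Pub/Sub or an API).
    # The return value is pushed to XCom automatically.
    return [{"id": 1, "value": 42}]


def load(**context):
    # Placeholder: read the extracted rows from XCom; a real pipeline
    # might write them to BigQuery here.
    rows = context["ti"].xcom_pull(task_ids="extract")
    print(f"would load {len(rows)} rows")


with DAG(
    dag_id="example_gcp_pipeline",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",      # run once per day
    catchup=False,                   # skip backfill runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task        # linear dependency: extract, then load
```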
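And a minimal sketch of the BigQuery SQL techniques the listing calls out (a CTE feeding a window function), run through the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical.

```python
# A minimal sketch using the google-cloud-bigquery client; all table and
# column names below are hypothetical.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # authenticates via Application Default Credentials

QUERY = """
WITH daily_totals AS (  -- CTE: one row per user per day
    SELECT
        user_id,
        DATE(event_ts) AS event_date,
        COUNT(*) AS events
    FROM `my-project.my_dataset.events`  -- hypothetical table
    GROUP BY user_id, event_date
)
SELECT
    user_id,
    event_date,
    events,
    -- window function: 7-day running total per user
    SUM(events) OVER (
        PARTITION BY user_id
        ORDER BY event_date
        ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
    ) AS events_7d
FROM daily_totals
ORDER BY user_id, event_date
"""

for row in client.query(QUERY).result():
    print(row.user_id, row.event_date, row.events_7d)
```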