TalentBridge

Senior GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior GCP Data Engineer on a contract basis, requiring expertise in data modeling, Python, and GCP services like BigQuery. Key skills include data pipelines, SQL/NoSQL databases, and Apache Airflow. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
February 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Lake #Scala #Data Pipeline #Data Security #Airflow #Data Storage #Data Engineering #Visualization #Storage #BigQuery #Security #NoSQL #Databases #Apache Airflow #GCP (Google Cloud Platform) #Data Modeling #Cloud #SQL (Structured Query Language) #Python
Role description
Job Description:
- Design, build, and operate scalable data pipelines and manage data storage in distributed systems and production environments.
- Apply strong data modeling expertise and use modern data engineering tools and platforms to support analytics and applications.
- Write clean, high-quality code (especially in Python) to develop and deploy large-scale, data-centric solutions.
- Work across diverse databases and platforms, including SQL/NoSQL systems, data lakes, and GCP services (BigQuery, Cloud SQL, Spanner), plus orchestration tools such as Apache Airflow.
- Translate data into insights using visualization, clear communication, and data-driven approaches, while ensuring data security and privacy.
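To give a concrete flavor of the pipeline work described above, here is a minimal, self-contained extract-transform-load sketch in plain Python. It uses an in-memory SQLite database as a toy stand-in for the SQL/BigQuery systems named in the posting, and the table and column names (`raw_events`, `user_spend`) are hypothetical examples, not part of the role:

```python
import sqlite3

def run_pipeline(conn):
    """Toy ETL step: extract raw event rows, aggregate, load a reporting table."""
    cur = conn.cursor()
    # Extract: stage some raw event rows (user_id, amount) -- hypothetical schema
    cur.execute("CREATE TABLE IF NOT EXISTS raw_events (user_id TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO raw_events VALUES (?, ?)",
        [("u1", 10.0), ("u1", 5.5), ("u2", 3.0)],
    )
    # Transform + load: aggregate spend per user into a reporting table
    cur.execute(
        "CREATE TABLE IF NOT EXISTS user_spend AS "
        "SELECT user_id, SUM(amount) AS total FROM raw_events GROUP BY user_id"
    )
    conn.commit()
    # Return the loaded results as a dict for inspection
    return dict(cur.execute("SELECT user_id, total FROM user_spend ORDER BY user_id"))

conn = sqlite3.connect(":memory:")
totals = run_pipeline(conn)
print(totals)  # {'u1': 15.5, 'u2': 3.0}
```

In a production GCP setting, the same shape of step would typically be an Airflow task writing to BigQuery rather than SQLite; this sketch only illustrates the extract/transform/load structure itself.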