GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with 8+ years of software/data engineering experience, strong GCP and Python expertise, and hands-on experience with BigQuery, Airflow, and Dataflow. It is a 100% remote position with a competitive pay rate.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 13, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#Airflow #Data Engineering #Scala #Data Pipeline #Docker #ETL (Extract, Transform, Load) #BigQuery #Cloud #GCP (Google Cloud Platform) #Apache Airflow #Pandas #Python #PySpark #Kubernetes #Spark (Apache Spark) #Dataflow
Role description
Senior Software Engineer (GCP Data Engineering) – 100% Remote (US-based candidates only)
We're looking for a Google Cloud Platform (GCP) expert with strong Python skills to design, build, and optimize large-scale data pipelines.
Tech Stack:
✅ BigQuery | Apache Airflow (DAGs) | Dataflow | Dataproc | Pub/Sub
✅ Python (Pandas, PySpark) | CI/CD | Docker | Kubernetes
You will:
🔹 Develop and orchestrate complex data workflows in GCP
🔹 Build scalable, reliable ETL pipelines
🔹 Collaborate with cross-functional teams to ensure timely, high-quality data delivery
Must-Have:
✔ 8+ years of software/data engineering experience
✔ Strong GCP and Python expertise
✔ Hands-on experience with BigQuery, Airflow, Dataflow, Dataproc, and Pub/Sub
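To give a sense of the day-to-day work described above, here is a minimal sketch of an Airflow DAG that orchestrates a BigQuery SQL job on a daily schedule. It assumes Airflow 2.x with the apache-airflow-providers-google package installed; the DAG name, project, dataset, and table identifiers are hypothetical placeholders, not details from this listing.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="example_daily_event_rollup",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                    # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Aggregate raw events into a daily rollup table; the query, project,
    # dataset, and table names below are placeholders for illustration only.
    rollup_events = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": (
                    "SELECT DATE(event_ts) AS event_date, COUNT(*) AS event_count "
                    "FROM `my-project.raw.events` "
                    "GROUP BY event_date"
                ),
                "destinationTable": {
                    "projectId": "my-project",       # hypothetical GCP project
                    "datasetId": "analytics",
                    "tableId": "daily_event_counts",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

In a production pipeline of the kind this role covers, a task like this would typically sit alongside Dataflow or Dataproc processing steps and Pub/Sub-driven ingestion, per the stack listed above.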