

Collabera
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with a contract length of "unknown." The pay rate is "$65-$75/hr." Key requirements include 7+ years in software/data engineering, extensive GCP data services experience, and strong PySpark/Spark skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
January 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Las Palmas 2, TX
-
🧠 - Skills detailed
#Programming #Python #SQL (Structured Query Language) #Cloud #Storage #GCP (Google Cloud Platform) #Looker #BigQuery #Data Lake #Airflow #Monitoring #Spark (Apache Spark) #Hadoop #Data Engineering #PySpark #DevOps
Role description
• 7+ years of senior-level software or data engineering experience
• 7+ years of Python programming experience, including Python frameworks and integrations
• 5+ years of hands-on experience with GCP data services such as Cloud Storage, Dataproc, BigQuery, Composer, Looker, and GCP SQL
• Proven experience building data platforms and pipelines on GCP
• Strong PySpark/Spark experience for data engineering and delivery
• Extensive experience with data lakes/lakehouses
• Experience migrating on-prem Hadoop/Airflow to GCP
• Hands-on Data DevOps experience (CI/CD, monitoring, operations)
Benefits:
The Company offers the following benefits for this position, subject to applicable eligibility requirements: medical insurance, dental insurance, vision insurance, 401(k) retirement plan, life insurance, long-term disability insurance, short-term disability insurance, paid parking/public transportation, paid time off, paid sick and safe time, paid vacation time, paid parental leave, and paid holidays (as applicable).
Pay Range: $65-$75/hr
