GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer based in Dallas, TX or Tampa, FL, requiring 8+ years of experience, including 2+ years in GCP. Key skills include BigQuery, Dataflow, SQL, Python/Scala, and ETL/ELT pipeline development.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 4, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Cloud #ETL (Extract, Transform, Load) #Git #IAM (Identity and Access Management) #Scala #Storage #DevOps #Migration #BigQuery #Dataflow #Python #SQL (Structured Query Language) #Data Engineering #Data Processing #Teradata #GCP (Google Cloud Platform)
Role description
Role: GCP Data Engineer
Location: Dallas, TX / Tampa, FL
Visa: all visas accepted except OPT and CPT
Experience: 8+ years

Required Skills & Experience
• 8+ years of experience as a Data Engineer, with 2+ years in GCP.
• Strong hands-on expertise with BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Cloud Composer, and IAM.
• Proven experience in Teradata-to-GCP migration projects.
• Proficiency in SQL (Teradata and BigQuery dialects) and Python/Scala for data processing.
• Experience with ETL/ELT pipeline development and orchestration.
• Strong understanding of data warehousing concepts, performance tuning, and query optimization.
• Familiarity with CI/CD, DevOps practices, and Git-based workflows.