

Google Cloud Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Google Cloud Data Engineer for 12-24+ months, paying competitively. It requires 10+ years of experience, expertise in GCP, BigQuery, ETL tools, Python, and relevant GCP certifications. Location options are Tampa, FL, or Dallas, TX.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 23, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Dallas, TX
Skills detailed
#Automation #PySpark #Security #Google Cloud Storage #Teradata #Dataflow #Scripting #Clustering #Apache Beam #Storage #IAM (Identity and Access Management) #BigQuery #Data Engineering #Airflow #Shell Scripting #Cloud #Python #Spark (Apache Spark) #ETL (Extract, Transform, Load) #Data Governance #SQL (Structured Query Language) #Data Processing #GCP (Google Cloud Platform)
Role description
5 roles to fill; interviews ASAP.
GCP Engineer
Location: Tampa FL, Dallas TX
Contract Duration: 12-24+ months
• Skillsets: GCP, PySpark, DAG, Airflow, Python, Teradata (good to have)
• Certifications: GCP
HYBRID ROLE
Job Summary:
We are looking for a highly skilled GCP Engineer with expertise in BigQuery and other GCP services to design, implement, and optimize data solutions on Google Cloud Platform. The ideal candidate will have strong experience in data engineering, cloud computing, and large-scale data processing.
Required Skills & Experience:
• 10+ years of experience working as a GCP Data Engineer or in a similar role.
• Strong expertise in Google BigQuery: performance tuning, partitioning, clustering, and cost optimization.
• Hands-on experience with ETL tools and frameworks, including Cloud Dataflow (Apache Beam), Cloud Dataproc (Spark), and Cloud Composer (Airflow).
• Proficiency in SQL, Python, and shell scripting for data transformation and automation.
• Experience with Google Cloud Storage, Pub/Sub, and Cloud Functions.
• Strong understanding of GCP IAM, security best practices, and data governance.
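The BigQuery requirement above centers on partitioning, clustering, and cost optimization. As a rough conceptual sketch of why partitioning cuts query cost (plain Python with no GCP dependencies; all names here are hypothetical, and this is not the BigQuery API), a filter on the partition column lets the engine skip whole partitions instead of scanning every row:

```python
from datetime import date

def build_partitions(rows):
    """Group rows into per-day 'partitions' keyed by their event_date."""
    partitions = {}
    for row in rows:
        partitions.setdefault(row["event_date"], []).append(row)
    return partitions

def scan_with_pruning(partitions, start, end):
    """Scan only partitions whose date falls in [start, end]; return (rows, rows_scanned)."""
    hits, scanned = [], 0
    for day, part in partitions.items():
        if start <= day <= end:  # partition pruning: non-matching days are skipped entirely
            scanned += len(part)
            hits.extend(part)
    return hits, scanned

# 30 days of July 2025, 10 rows per day -> 300 rows total
rows = [
    {"event_date": date(2025, 7, d), "user": f"u{d}-{i}"}
    for d in range(1, 31) for i in range(10)
]
partitions = build_partitions(rows)

hits, scanned = scan_with_pruning(partitions, date(2025, 7, 1), date(2025, 7, 3))
print(len(rows), scanned, len(hits))  # full table is 300 rows; the pruned scan touches only 30
```

In actual BigQuery the same effect comes from declaring the table partitioned on a DATE/TIMESTAMP column and filtering on that column, so only the matching partitions are read and billed; clustering further sorts data within each partition to reduce the bytes scanned.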
Apply now