

Ampstek
GCP Data Engineer (Only W2)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer (W2) with 10+ years of experience, focusing on GCP BigQuery, Python, and SQL. Remote work is available, and expertise in data warehousing, ETL/ELT pipelines, and data governance is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Transformations #Dataflow #ETL (Extract, Transform, Load) #Data Governance #Python #Teradata #Storage #Automation #BigQuery #GCP (Google Cloud Platform) #Data Engineering #Data Modeling #Cloud #SQL (Structured Query Language) #Airflow #Scripting
Role description
Job Title: GCP Data Engineer
Job Location: USA (Remote)
Minimum years of experience: 10+ years
Must Have Skills:
GCP BigQuery
Python
SQL
Nice to Have Skills:
Dataflow
Airflow
Detailed Job Description:
10+ years of experience in Data Engineering with strong expertise in designing, building, and optimizing large-scale data solutions.
Technical Skills:
Data Warehousing & Cloud Platforms:
Proficient in Teradata and Google Cloud Platform (GCP) services, including BigQuery, Cloud Storage, Dataflow, and Pub/Sub.
Expert in SQL for complex query development and performance tuning.
Strong experience in Python scripting for automation, ETL workflows, and data transformations (a brief illustrative sketch follows this list).
Thorough understanding of data modeling, ETL/ELT pipelines, data governance, and performance optimization.
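For context, a minimal sketch of the kind of Python-plus-BigQuery ETL work described above, assuming the google-cloud-bigquery client library; the project, dataset, table, bucket, and column names are placeholders, not values from this posting.

from google.cloud import bigquery

# Placeholder identifiers, not real resources.
PROJECT_ID = "example-project"
TABLE_ID = f"{PROJECT_ID}.analytics.daily_orders"
GCS_URI = "gs://example-bucket/exports/daily_orders_*.csv"

client = bigquery.Client(project=PROJECT_ID)

# Load CSV exports from Cloud Storage into BigQuery, replacing prior contents.
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=load_config)
load_job.result()  # Block until the load job completes.

# A simple downstream transformation: aggregate the loaded rows with SQL.
query = f"""
    SELECT order_date, SUM(order_total) AS revenue
    FROM `{TABLE_ID}`
    GROUP BY order_date
    ORDER BY order_date
"""
for row in client.query(query).result():
    print(row.order_date, row.revenue)

In practice, scripts like this are usually parameterized and scheduled by an orchestrator such as Airflow or Cloud Composer rather than run ad hoc.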
Please share your resume with: tonny@ampstek.com






