

Tekskills Inc.
GCP Data Engineer / Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer/Data Architect in Irving, TX, lasting 12+ months, offering a W2 contract. Key skills include GCP BigQuery, Dataflow, and Teradata. Requires 4-6 years of data architecture and engineering experience, with 2 years in GCP.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 21, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Irving, TX
-
🧠 - Skills detailed
#Data Engineering #SQL (Structured Query Language) #BTEQ #Cloud #BigQuery #Dataflow #Data Architecture #Storage #Python #Teradata #GCP (Google Cloud Platform) #Scripting #Airflow
Role description
Need on W2; GC/USC preferred by the client for this requirement
Job Title: GCP Data Engineer / Data Architect
Location: Irving, TX 75039 (Onsite)
Duration: 12+ Months
Job Details:
Must Have Skills
• GCP BigQuery
• Cloud Storage
• Dataflow, Dataproc, and Composer (Airflow)
Detailed Job Description
• 4 to 6 years of experience in data architecture, with at least 2 years in GCP.
• 4 to 6 years of experience in data engineering, with at least 2 years in GCP.
• Strong hands-on experience with Teradata data warehousing, BTEQ, and complex SQL.
• Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
• Experience with ETL/ELT pipelines using custom scripting tools (Python/Java).
• Proven ability to refactor and translate legacy logic from Teradata to GCP.
• Familiarity with CI/CD.