

HatchPros
Data Engineer (Teradata and GCP)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer specializing in Teradata-to-GCP migration, offering an initial 3-month contract with a pay rate listed as "TBD." The position is preferably hybrid in Phoenix, AZ, or remote within MST. It requires 7+ years of experience, strong SQL skills, and GCP knowledge.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
January 15, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#Airflow #"ETL (Extract #Transform #Load)" #BigQuery #Apache Kafka #Data Engineering #Migration #Dataflow #GCP (Google Cloud Platform) #Kafka (Apache Kafka) #Storage #Teradata #BTEQ #AI (Artificial Intelligence) #ML (Machine Learning) #Cloud #SQL (Structured Query Language)
Role description
US Citizens (USC) and Green Card (GC) holders only
LinkedIn URL required
3-month initial contract
Preferred: hybrid onsite in Phoenix, AZ, or remote working in MST
Only candidates currently working on a Teradata to GCP migration project will be considered.
Key skills: Teradata, GCP, SQL; Google certification strongly preferred
Required Skills
• 7+ years of experience in Data Engineering, with at least 2 years in GCP.
• Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
• Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
• Proven ability to refactor and translate legacy logic from Teradata to GCP.
Preferred Qualifications
• GCP certification (Preferred: Professional Data Engineer).
• Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
Key Responsibilities
• Execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
• Analyze and map existing Teradata workloads to appropriate GCP equivalents.
• Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery); a brief sketch of this kind of rewrite follows.
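For illustration only, a Teradata-to-BigQuery dialect translation of the kind this responsibility involves might look like the example below. The project, dataset, and table names are hypothetical, and the rewrite relies on BigQuery standard SQL's QUALIFY clause and date functions.

-- Teradata (original): latest order per customer over the past 30 days
SEL customer_id,
    order_id,
    order_total
FROM sales_db.orders
WHERE order_date >= CURRENT_DATE - 30
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) = 1;

-- BigQuery standard SQL (rewritten): SEL becomes SELECT, implicit date
-- arithmetic becomes DATE_SUB, and the table gets a project-qualified name
SELECT
  customer_id,
  order_id,
  order_total
FROM `my_project.sales_db.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) = 1;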






