

Lorven Technologies Inc.
GCP Data Engineer (SAS to GCP Migration)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer focused on SAS to GCP migration, located in Dallas, TX. It is a long-term contract position requiring SQL expertise, Python and PySpark development skills, and hands-on experience with GCP and BigQuery.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 12, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Macros #SQL (Structured Query Language) #Teradata SQL #Data Engineering #Airflow #BigQuery #Teradata #PySpark #SAS #Storage #GCP (Google Cloud Platform) #Data Processing #Scripting #Migration #Cloud #Scala #Data Manipulation #AI (Artificial Intelligence) #Python #Spark (Apache Spark)
Role description
Job Title: GCP Data Engineer (SAS to GCP Migration)
Location: Dallas, TX | Hybrid
Duration: Long-Term Contract
Job Description:
SQL Expertise (Teradata → BigQuery)
• Deep understanding of Teradata SQL dialect
• Ability to rewrite queries in BigQuery Standard SQL
• Knowledge of query optimization, partitioning, and cost control (see the sketch after this list)
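To illustrate the kind of rewrite this involves, here is a minimal sketch in Python using the google-cloud-bigquery client: a hypothetical Teradata aggregation (shown in the comment) re-expressed in BigQuery Standard SQL, with a partition-pruning date filter and a bytes-billed cap as cost controls. The project, dataset, table, and column names are invented for the example.

from google.cloud import bigquery

# Hypothetical Teradata original:
#   SEL customer_id, SUM(amount)
#   FROM sales.orders
#   WHERE order_date >= ADD_MONTHS(CURRENT_DATE, -3)
#   GROUP BY 1;

client = bigquery.Client(project="my-gcp-project")  # invented project id

sql = """
SELECT customer_id, SUM(amount) AS total_amount
FROM `my-gcp-project.sales.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 3 MONTH)  -- prunes date partitions
GROUP BY customer_id
"""

# Cap how much a single query may scan as an extra cost guard.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)

for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total_amount)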
Python & PySpark Development
• Proficiency in Python for data manipulation and scripting
• Experience with PySpark for scalable, distributed data processing
• Familiarity with UDFs, DataFrames, and pipeline orchestration (see the sketch after this list)
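As a rough sketch of the PySpark side, the snippet below reads a hypothetical Parquet dataset from Cloud Storage, applies a small UDF, and aggregates with the DataFrame API. The bucket paths, column names, and cleaning rule are invented.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("orders_cleanup").getOrCreate()

# Hypothetical raw extract landed in Cloud Storage.
orders = spark.read.parquet("gs://my-bucket/raw/orders/")

# A tiny UDF for illustration; built-in functions are preferred when they exist.
@F.udf(returnType=StringType())
def normalize_region(code):
    return (code or "").strip().upper()

orders_by_region = (
    orders
    .withColumn("region", normalize_region(F.col("region_code")))
    .filter(F.col("amount") > 0)
    .groupBy("region")
    .agg(F.sum("amount").alias("total_amount"))
)

orders_by_region.write.mode("overwrite").parquet("gs://my-bucket/curated/orders_by_region/")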
GCP & BigQuery Engineering
• Hands-on experience with BigQuery, Cloud Storage, Cloud Composer (Airflow)
• Ability to build and manage ETL pipelines in GCP (a minimal DAG sketch follows)
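A bare-bones Cloud Composer (Airflow 2) DAG along these lines might load a file from Cloud Storage into a staging table and then run a BigQuery transformation. The bucket, dataset, and table names below are placeholders, and a real pipeline would add retries, alerting, and data-quality checks.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw CSVs from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="my-bucket",
        source_objects=["raw/orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="my-gcp-project.staging.orders",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into a curated reporting table.
    transform = BigQueryInsertJobOperator(
        task_id="build_orders_by_region",
        configuration={
            "query": {
                "query": """
                    SELECT region, SUM(amount) AS total_amount
                    FROM `my-gcp-project.staging.orders`
                    GROUP BY region
                """,
                "destinationTable": {
                    "projectId": "my-gcp-project",
                    "datasetId": "curated",
                    "tableId": "orders_by_region",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform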
SAS & Teradata Application Analysis
• Ability to interpret SAS macros, DATA steps, and PROC SQL (an example mapping follows)
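By way of illustration, the comment block below shows an invented SAS PROC SQL step with a macro-variable date cutoff, followed by one plausible BigQuery translation in which the macro variable becomes a named query parameter. The table names and threshold are made up.

import datetime

from google.cloud import bigquery

# Invented SAS source:
#   %let cutoff = '01JAN2024'd;
#   proc sql;
#     create table work.big_orders as
#     select customer_id, sum(amount) as total_amount
#     from td.orders
#     where order_date >= &cutoff
#     group by customer_id
#     having sum(amount) > 1000;
#   quit;

# One plausible translation: the macro variable becomes a named parameter.
sql = """
SELECT customer_id, SUM(amount) AS total_amount
FROM `my-gcp-project.sales.orders`
WHERE order_date >= @cutoff
GROUP BY customer_id
HAVING SUM(amount) > 1000
"""

client = bigquery.Client(project="my-gcp-project")
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("cutoff", "DATE", datetime.date(2024, 1, 1)),
    ]
)
rows = list(client.query(sql, job_config=job_config).result())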
Prompt Engineering & AI Tooling
• Crafting effective prompts for AI-assisted code translation tools (a template sketch follows)
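The sketch below shows one way such a translation prompt might be templated in Python; the wording and rules are illustrative rather than a prescribed standard, and no particular AI tool or API is assumed.

SAS_TO_BQ_PROMPT = """You are migrating SAS code to Google BigQuery.

Source: SAS 9.4 (macros, DATA steps, PROC SQL) reading Teradata tables.
Target: BigQuery Standard SQL, with Python/PySpark where SQL alone is not enough.

Rules:
- Preserve business logic exactly; flag anything ambiguous instead of guessing.
- Replace macro variables with named query parameters.
- Map Teradata functions (e.g. ADD_MONTHS) to BigQuery equivalents.
- Add a partition filter on the date column when one exists.

Translate the following SAS program and list any assumptions you made:

{sas_code}
"""

def build_prompt(sas_code: str) -> str:
    """Fill the template with the SAS program to be translated."""
    return SAS_TO_BQ_PROMPT.format(sas_code=sas_code)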