

DBT - Data Engineer (GCP)
Featured Role | Apply direct with Data Freelance Hub
This role is for a DBT - Data Engineer (GCP) on a contract basis, located in New York, Texas, or New Jersey (Hybrid). It requires strong dbt, BigQuery, and SQL skills, data migration experience, and familiarity with GCP services and Agile methodologies.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 4, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: New York, United States
Skills detailed: #Version Control #Python #BigQuery #Dataflow #Migration #SQL (Structured Query Language) #Scrum #Storage #Automation #Bash #GitHub #Agile #Teradata #GCP (Google Cloud Platform) #BTEQ #Data Engineering #Scripting #Data Migration #dbt (data build tool) #Airflow #Cloud #ETL (Extract, Transform, Load) #EDW (Enterprise Data Warehouse) #Data Pipeline #Data Transformations
Role description
Role: DBT - Senior Data Engineer - GCP
Location: New York / Texas / New Jersey (Hybrid - Onsite)
Duration: Contract
Job Description:
We are looking for a GCP Data Engineer to join our EDW modernization project, focusing on dbt, BigQuery, and data warehousing. The role involves building and maintaining data pipelines in BigQuery and supporting migration from legacy systems such as Teradata.
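For context on the day-to-day work, the sketch below shows the kind of transformation step a dbt model would materialize in BigQuery, written here directly against the google-cloud-bigquery Python client. It is illustrative only: the project, dataset, and table names are hypothetical, and it assumes application-default credentials are configured.

from google.cloud import bigquery

# Hypothetical project/dataset/table names, for illustration only.
PROJECT = "my-gcp-project"
SQL = """
CREATE OR REPLACE TABLE `my-gcp-project.analytics.daily_orders` AS
SELECT
  order_date,
  COUNT(*) AS order_count,
  SUM(order_total) AS revenue
FROM `my-gcp-project.staging.orders`
GROUP BY order_date
"""

def run_transformation() -> None:
    """Run one transformation step as a BigQuery job and wait for it."""
    client = bigquery.Client(project=PROJECT)  # uses application-default credentials
    job = client.query(SQL)  # submit the SQL as an asynchronous query job
    job.result()             # block until the job completes; raises on failure
    print(f"Job {job.job_id} finished.")

if __name__ == "__main__":
    run_transformation()

In a dbt project the same logic would live in a model file as a plain SELECT statement, with dbt handling the CREATE OR REPLACE TABLE materialization and dependency ordering.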
Qualifications:
• Strong experience with dbt (Data Build Tool).
• Expertise in BigQuery and data warehousing concepts.
• Experience in data migration.
• Strong SQL skills with the ability to build complex data transformations in BigQuery.
• Knowledge of Teradata and the ability to understand BTEQ scripts.
• Expertise with GCP services: BigQuery, Cloud Storage (GCS), Dataproc, Pub/Sub, Dataflow, Data Fusion, Cloud Functions.
• Experience with job orchestration tools such as Airflow (see the DAG sketch after this list).
• Good Python and Bash scripting skills for automation.
• Familiarity with version control tools (e.g., GitHub).
• Experience working in Agile/Scrum teams.
• Strong problem-solving, analytical, and communication skills.
• Ability to work in an onsite/offshore model and lead a small team.
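For the Airflow point above, here is a minimal DAG sketch that orchestrates a dbt build against BigQuery. It assumes Airflow 2.x and a hypothetical dbt project path; the task structure and schedule are illustrative, not a prescribed setup.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical dbt project location; adjust to the real layout.
DBT_DIR = "/opt/dbt/edw_project"

with DAG(
    dag_id="dbt_edw_daily",
    start_date=datetime(2025, 9, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build all models, then run dbt tests against the fresh tables.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --target prod",
    )
    dbt_run >> dbt_test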
Please send your resume to nikita.kongari@aptino.com