Jobs via Dice

Google Cloud Platform Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Google Cloud Platform Data Engineer in Hartford, CT (Hybrid). Contract length is long-term. Requires 12+ years in Data Engineering, 2+ years in GCP, and strong skills in Teradata, SQL, ETL/ELT, and Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 30, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Hartford, CT
-
🧠 - Skills detailed
#Cloud #Migration #Python #GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #Big Data #Argo #Scripting #Storage #Monitoring #DevOps #BTEQ #Data Engineering #Git #Logging #Airflow #Compliance #Dataflow #Data Quality #Teradata #Google Cloud Dataproc #BigQuery #Data Architecture #Data Storage #Java #SQL (Structured Query Language)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Lorven Technologies, Inc., is seeking the following. Apply via Dice today! Hi , Our client is looking Google Cloud Platform Data Engineer For Contract role in Hartford, CT (Hybrid) below is the detailed requirements. Kindly share your Updated Resume to proceed further. Job Title : Google Cloud Platform Data Engineer Location : Hartford, CT (Hybrid) 3 days Onsite 2 days Remote Duration : Long term Job Description: Required Qualifications • Data Engineer 12+ years of experience in Data Engineering, with at least 2 years in Google Cloud Platform. • Strong hands-on Experience in Teradata data warehousing, BTEQ, and complex SQL. • Solid knowledge of Google Cloud Platform services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc. • Experience with ETL/ELT pipelines using custom scripting tools (Python/Java). • Proven ability to refactor and translate legacy logic from Teradata to Google Cloud Platform. • Familiarity with CI/CD, GIT, Argo CD, and DevOps practices in cloud data envi 3. • Experience ,Teradata,Python,Big Data, Google Big Query, Google Cloud DataProc, Google Cloud SQL, Google Cloud Composer, Google Cloud Pub/Sub Responsibilities: • Lead and execute migration of data and ETL workflows from Teradata to Google Cloud Platform-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow). • Analyze and map existing Teradata workloads to appropriate Google Cloud Platform equivalents. • Rewrite SQL logic, scripts, and procedures in Google Cloud Platform-compliant formats (e.g., standard SQL for BigQuery). • Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance. • Develop automated workflows for data movement and transformation using Google Cloud Platform-native tools and/or custom scripts (Python). 
• Optimize data storage, query performance, and costs in the cloud environment.
• Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
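To give a flavor of the "rewrite SQL logic in Google Cloud Platform-compliant formats" responsibility: Teradata dialect features such as the `SEL` shorthand, the `MINUS` set operator, and `SAMPLE` have different spellings or semantics in BigQuery standard SQL. The sketch below is a hypothetical, rule-based rewriter for illustration only; real migrations at this scale typically rely on Google's BigQuery batch SQL translator rather than hand-rolled regexes.

```python
import re

# Hypothetical mapping of a few Teradata idioms to BigQuery standard SQL.
# SAMPLE n is randomized in Teradata; LIMIT n is a crude approximation.
REWRITE_RULES = [
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),            # Teradata shorthand
    (re.compile(r"\bMINUS\b", re.IGNORECASE), "EXCEPT DISTINCT"), # set-operator spelling
    (re.compile(r"\bSAMPLE\s+(\d+)\b", re.IGNORECASE), r"LIMIT \1"),
]

def translate(sql: str) -> str:
    """Apply each rewrite rule in order and return the translated SQL."""
    for pattern, replacement in REWRITE_RULES:
        sql = pattern.sub(replacement, sql)
    return sql

print(translate("SEL id FROM accounts SAMPLE 10"))
# SELECT id FROM accounts LIMIT 10
```

Stored procedures and BTEQ control logic need deeper restructuring than token rewrites; those are usually re-implemented in Composer (Airflow) tasks or Dataflow jobs.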
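The "validate data quality" responsibility often starts with reconciliation checks after each migrated load, e.g. comparing per-table row counts between the Teradata source and the BigQuery target. A minimal sketch, with the counts stubbed as plain dicts rather than live query results (the function name and inputs are illustrative, not from the posting):

```python
def find_count_mismatches(source_counts: dict, target_counts: dict) -> dict:
    """Return {table: (source_rows, target_rows)} for every table whose
    migrated row count differs from the source, or is missing entirely."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches[table] = (src, tgt)
    return mismatches

src = {"accounts": 1200, "claims": 530}
tgt = {"accounts": 1200, "claims": 529}
print(find_count_mismatches(src, tgt))
# {'claims': (530, 529)}
```

In practice the same comparison would also cover column checksums or aggregate totals, since matching row counts alone do not prove the data transferred intact.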