

Sibitalent Corp
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Hartford, Connecticut, offering a 6-12 month contract at a competitive pay rate. Required skills include GCP, Python, and SQL, with healthcare experience preferred. A manager reference is necessary.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
January 7, 2026
Duration
More than 6 months
-
Location
Hybrid
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
Hartford County, CT
-
Skills detailed
#Data Quality #Cloud #BTEQ #GCP (Google Cloud Platform) #Compliance #Logging #Python #Java #Data Architecture #AI (Artificial Intelligence) #Scripting #Airflow #Teradata #Monitoring #SQL (Structured Query Language) #BigQuery #Data Migration #Data Storage #ETL (Extract, Transform, Load) #Apache Kafka #ML (Machine Learning) #GIT #DevOps #Data Governance #Security #Dataflow #Data Engineering #Migration #Argo #Data Pipeline #Storage #Data Warehouse #Kafka (Apache Kafka)
Role description
Hi,
Hope you are doing well.
IMMEDIATE INTERVIEW (Interview Process: One + done) - GCP Data Engineers in Hartford, Connecticut - HYBRID (LOCAL CANDIDATES ONLY) - OPEN TO W2 CANDIDATES
Please find the job details below and reply if you're interested in learning more about this role.
Job Title: GCP Data Engineers
Location: Hartford, Connecticut - HYBRID (LOCAL CANDIDATES ONLY)
Duration: 6-12 month contract
NEED ONE MANAGER REFERENCE WITH THEIR LINKEDIN PROFILE AND OFFICIAL EMAIL ADDRESS.
Other Acceptable Titles: GCP Engineer, Data Engineer, Cloud Engineer
Top 3 must-have technologies:
• GCP
• Python
• SQL
Nice-to-Haves:
• Teradata (the platform they are moving off of)
• Healthcare experience
Will be looking for:
• Candidates who are well-versed and hands-on with the required technologies
• Candidates who are self-sufficient and can work independently
Position Detail:
We are seeking skilled Data Engineer(s) to support a high-impact enterprise data migration initiative. The goal is to migrate data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP). The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.
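For illustration only, below is a minimal sketch of the kind of load-and-validate step this migration involves, using the google-cloud-bigquery Python client; the project, bucket, dataset, and table names are hypothetical placeholders rather than details from this posting.

# Minimal sketch (hypothetical names): load a Teradata CSV extract that has been
# exported to Cloud Storage into BigQuery, then run a simple row-count check.
from google.cloud import bigquery

client = bigquery.Client(project="example-migration-project")
table_id = "example-migration-project.analytics.stg_claims"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load the exported files from a GCS landing bucket into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/claims/*.csv",
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

# Basic data-quality check: count the rows that landed in the staging table.
rows = client.query(f"SELECT COUNT(*) AS row_count FROM `{table_id}`").result()
print(f"Loaded rows: {next(iter(rows)).row_count}")

In practice, a check like this row count would be compared against the source Teradata table as part of migration validation.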
Key Responsibilities:
• Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
• Analyze and map existing Teradata workloads to appropriate GCP equivalents.
• Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
• Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
• Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python); see the orchestration sketch after this list.
• Optimize data storage, query performance, and costs in the cloud environment.
• Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
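As a rough sketch of the orchestration work described above (not the client's actual pipeline), a minimal Cloud Composer (Airflow) DAG might stage a Teradata CSV extract from Cloud Storage into BigQuery and then run a rewritten standard-SQL transformation; the DAG id, bucket, dataset, and table names below are assumed for the example.

# Hypothetical Composer (Airflow) DAG: stage a Teradata CSV extract from GCS into
# BigQuery, then rebuild a reporting table with standard SQL. Names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="teradata_to_bq_claims",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the exported CSV files into a staging table.
    load_staging = GCSToBigQueryOperator(
        task_id="load_staging",
        bucket="example-landing-bucket",
        source_objects=["claims/*.csv"],
        destination_project_dataset_table="example-migration-project.analytics.stg_claims",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Rebuild the reporting table with BigQuery standard SQL (the kind of logic
    # that would be translated from legacy Teradata BTEQ scripts).
    transform = BigQueryInsertJobOperator(
        task_id="transform_claims",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example-migration-project.analytics.claims` AS "
                    "SELECT claim_id, member_id, DATE(service_date) AS service_date, "
                    "SUM(paid_amount) AS paid_amount "
                    "FROM `example-migration-project.analytics.stg_claims` "
                    "GROUP BY claim_id, member_id, service_date"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_staging >> transform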
Required Skills:
• 4 to 6+ years of experience in Data Engineering, with at least 2 years on GCP.
• Strong hands-on experience with Teradata data warehousing, BTEQ, and complex SQL.
• Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
• Experience building ETL/ELT pipelines with custom scripting (Python/Java).
• Proven ability to refactor and translate legacy logic from Teradata to GCP.
• Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
• Strong analytical, troubleshooting, and communication skills.
Preferred Qualifications:
• GCP certification (Preferred: Professional Data Engineer).
• Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
• Experience working in the healthcare domain.
• Knowledge of data governance, security, and compliance in cloud ecosystems.
Behavioral Skills:
• Problem-solving mindset
• Attention to detail
• Accountability and ownership
• Curiosity and staying current with evolving GCP services
Preference:
Ability to work in the Hartford, CT office at least three days a week.






