

Sibitalent Corp
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-12 month contract in Hartford, CT, requiring 4-6+ years of Data Engineering experience, expertise in GCP and Python, and familiarity with Teradata. Healthcare experience is a plus.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
January 7, 2026
Duration
More than 6 months
-
Location
Hybrid
-
Contract
Fixed Term
-
Security
Unknown
-
Location detailed
Hartford, CT
-
Skills detailed
#Data Quality #Cloud #BTEQ #GCP (Google Cloud Platform) #Compliance #Logging #Python #Java #Data Architecture #AI (Artificial Intelligence) #Scripting #Airflow #Teradata #Monitoring #SQL (Structured Query Language) #BigQuery #Data Migration #Data Storage #ETL (Extract, Transform, Load) #Apache Kafka #ML (Machine Learning) #GIT #DevOps #Data Governance #Security #Dataflow #Data Engineering #Migration #Argo #Data Pipeline #Storage #Data Warehouse #Kafka (Apache Kafka)
Role description
Role - GCP Data Engineers
Location - At least 2x/week on-site at 151 Farmington Avenue, Hartford, CT 06156 - locals only
Interview Process: One and done - a 1-hour, 100% technical interview; candidates will be asked to share their screen and code.
One manager reference is required, including their LinkedIn profile and official email address.
Duration: 6-12 month contract
Top 3 must-have technologies:
• GCP
• Python
• SQL
Nice-to-Haves:
• Teradata (the platform they are moving off of)
• Healthcare experience
Will be looking for:
• Candidates who are well-versed and hands-on with the required technologies
• Candidates who are self-sufficient and able to work independently
Position Detail:
We are seeking skilled Data Engineer(s) to support a high-impact enterprise data migration initiative. The goal is to migrate data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP). The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.
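For a concrete sense of what one step of this work can look like, below is a minimal sketch in Python, assuming the Teradata tables have already been exported to Parquet files in Cloud Storage (an upstream step not shown) and that application-default credentials are configured. The project, bucket, and table names are hypothetical placeholders, not the client's actual environment.

# Minimal sketch: load a Teradata table exported as Parquet in GCS into BigQuery.
# All names below are placeholders for illustration only.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # hypothetical project ID

table_id = "my-gcp-project.analytics.orders"         # hypothetical destination table
source_uri = "gs://td-export-staging/orders/*.parquet"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    # Overwrite on each run so reloads during migration testing stay idempotent.
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()   # block until the load job finishes, raising on failure

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")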
Key Responsibilities:
• Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
• Analyze and map existing Teradata workloads to appropriate GCP equivalents.
• Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
• Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
• Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python); a minimal Composer sketch follows this list.
• Optimize data storage, query performance, and costs in the cloud environment.
• Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
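To make the orchestration side concrete, here is a minimal Composer DAG sketch, assuming a recent Airflow 2 / Composer 2 environment and reusing the same hypothetical bucket, dataset, and table names as the load sketch above. It wires a GCS-to-BigQuery load to a basic post-load row-count check; a real migration pipeline would add reconciliation and alerting around it.

# Minimal sketch of a Composer (Airflow 2) DAG for one migrated table.
# Bucket/dataset/table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryCheckOperator

with DAG(
    dag_id="teradata_to_bq_orders",
    start_date=datetime(2026, 1, 1),
    schedule=None,    # triggered manually during migration cutover
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="td-export-staging",                       # hypothetical bucket
        source_objects=["orders/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table="analytics.orders",
        write_disposition="WRITE_TRUNCATE",
    )

    # Fail the run if the load produced an empty table.
    row_count_check = BigQueryCheckOperator(
        task_id="row_count_check",
        sql="SELECT COUNT(*) > 0 FROM `analytics.orders`",
        use_legacy_sql=False,
    )

    load_orders >> row_count_check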
Required Skills:
• 4 to 6+ years of experience in Data Engineering, with at least 2 years in GCP.
• Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
• Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
• Experience with ETL/ELT pipelines using custom scripting tools (Python/Java).
• Proven ability to refactor and translate legacy logic from Teradata to GCP (see the dialect example after this list).
• Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
• Strong analytical, troubleshooting, and communication skills.
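As an illustration of the kind of translation this refactoring involves: BigQuery's GoogleSQL keeps Teradata's QUALIFY clause but has no ADD_MONTHS, so date arithmetic moves to DATE_SUB. The table, column, and project names below are invented, and the dry run is just one cheap way to confirm a rewritten query parses before it is promoted.

# Hypothetical dialect translation for one query, plus a dry-run syntax check.
from google.cloud import bigquery

# Original Teradata SQL, kept for reference: ADD_MONTHS is Teradata-specific.
TERADATA_SQL = """
SELECT customer_id, order_date, order_total
FROM sales.orders
WHERE order_date >= ADD_MONTHS(CURRENT_DATE, -3)
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) = 1
"""

# BigQuery GoogleSQL rewrite: QUALIFY carries over, date math becomes DATE_SUB.
BIGQUERY_SQL = """
SELECT customer_id, order_date, order_total
FROM `my-gcp-project.sales.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 3 MONTH)
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) = 1
"""

client = bigquery.Client(project="my-gcp-project")   # hypothetical project ID
dry_run = client.query(
    BIGQUERY_SQL,
    job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False),
)
# A dry run validates and prices the query without executing it.
print(f"Query OK, would scan {dry_run.total_bytes_processed} bytes")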
Preferred Qualifications:
• GCP certification (Preferred: Professional Data Engineer).
• Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
• Experience working in the healthcare domain.
• Knowledge of data governance, security, and compliance in cloud ecosystems.
Behavioral Skills:
• Problem-solving mindset
• Attention to detail
• Accountability and ownership
• Curiosity and a commitment to staying current with evolving GCP services
Preference:
Ability to work in the Hartford, CT office at least three times a week.
Thanks and Regards,
Nikhil
Technical Recruiter
Email: Nikhil@sibitalent.com
Web: www.sibitalent.com
101, E, Park Blvd.-Suite 600, Plano, TX 75074, USA
Destiny hears - When you speak louder
Note: SibiTalent Corp. is an equal opportunity staffing firm. We do not discriminate on the basis of race, caste, color, religion, gender, culture, visa status, or any other protected characteristic. All hiring decisions are made strictly based on qualifications, experience, and specific client requirements.