

ConglomerateIT
GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a 12+ month contract in Hartford, CT (hybrid). Requires 7+ years in IT, 4+ years building GCP data pipelines, proficiency in Python and SQL, and experience in Banking & Financial Services.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
February 11, 2026
Duration
More than 6 months
Location
Hybrid
Contract
1099 Contractor
Security
Unknown
Location detailed
Hartford, CT
Skills detailed
#Python #Scripting #Dataflow #Batch #BigQuery #Unix #Data Processing #PySpark #Computer Science #GitHub #Spark (Apache Spark) #Data Engineering #Scala #Datasets #GCP (Google Cloud Platform) #Code Reviews #Jenkins #SQL (Structured Query Language) #Data Pipeline #Kafka (Apache Kafka) #Cloud #Strategy
Role description
Job Title: GCP Data Engineer
Tax Term: W2/1099
Location: Hartford, CT (Hybrid)
Employment Type: Contract
Duration: 12+ months
About Us
ConglomerateIT is a certified firm and a pioneer in providing premium end-to-end Global Workforce Solutions and IT Services to diverse clients across various domains. Visit us at http://www.conglomerateit.com
Our mission is to establish global, cross-cultural human connections that further the careers of our employees and strengthen the businesses of our clients. We are driven to use the power of our global network to connect businesses with the right people, without bias, and to deliver Global Workforce Solutions with affability.
Job Summary:
We are seeking a skilled GCP Data Engineer with expertise in streaming and batch pipelines, large-scale data processing, and real-time decision-making on GCP.
Key Responsibilities:
• Develop and implement scalable data pipelines on GCP utilizing BigQuery, Dataproc, Pub/Sub or Kafka, and Dataflow (see the streaming sketch after this list).
• Design and build pipelines to process extensive datasets using PySpark (see the batch sketch after this list).
• Apply 4+ years of hands-on experience with Python (or another scripting language) and SQL.
• Drive data pipeline strategy, mentor junior engineers, and ensure best practices through code reviews.
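For illustration only, here is a minimal streaming sketch using Apache Beam's Python SDK, the programming model that Dataflow executes. It reads JSON events from a Pub/Sub topic and appends them to a BigQuery table; the project, topic, table, and schema names are hypothetical placeholders, not details from this role.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Streaming mode is required for unbounded Pub/Sub sources on Dataflow.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-proj/topics/events")
            | "DecodeJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-proj:analytics.events",
                schema="event_id:STRING,amount:FLOAT,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

And a minimal PySpark batch sketch of the kind a Dataproc job might run, assuming the spark-bigquery connector that Dataproc provides; the bucket and table names are again hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-txn-rollup").getOrCreate()

    # Read a large partitioned dataset from Cloud Storage.
    txns = spark.read.parquet("gs://my-bucket/transactions/")

    # Roll transactions up to one row per account per day.
    daily = (
        txns.groupBy("account_id", F.to_date("ts").alias("day"))
            .agg(F.sum("amount").alias("total_amount"),
                 F.count("*").alias("txn_count"))
    )

    # The connector stages results through a temporary GCS bucket.
    (daily.write.format("bigquery")
          .option("table", "my-proj.analytics.daily_txns")
          .option("temporaryGcsBucket", "my-bucket-tmp")
          .mode("overwrite")
          .save())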
Requirements:
• Bachelor's degree in Computer Science, Information Technology, or equivalent professional experience.
• Minimum of 7 years in IT, with at least 4 years dedicated to building scalable data pipelines on GCP.
• Proficiency in scripting languages such as Python or Unix shell.
• Strong analytical and problem-solving skills with keen attention to detail.
• Solid experience in the Banking & Financial Services domain.
Preferred Qualifications:
• Google Cloud Professional Data Engineer certification.
• Experience working in multi-cloud environments.
• Familiarity with CI/CD tools like Cloud Build, Jenkins, and GitHub Actions.





