

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a 6+ month contract, working remotely (candidates near Cleveland, OH preferred), with competitive pay. Key skills include GCP services, SQL, and data integration; financial services experience is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 3, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Brooklyn, OH
Skills detailed: #Oracle #Data Processing #SQL Server #Batch #Leadership #Cloud #Dataflow #Data Pipeline #ETL (Extract, Transform, Load) #BigQuery #SQL (Structured Query Language) #GitHub #Agile #Python #Data Integration #Data Quality #Scala #Data Engineering #Data Architecture #dbt (data build tool) #Data Lifecycle #Security #GCP (Google Cloud Platform)
Role description
Title: GCP Data Engineer
Location: Remote (candidates from the Cleveland, OH area preferred)
Duration: 6+ Months
Summary
Seeking a Google Cloud Data Engineer to join a talented team building a new and exciting data product. The ideal candidate will bring a strong technical background, leadership experience, and exceptional communication skills to engage business stakeholders and drive data transformation initiatives in an Agile environment.
Responsibilities
Design, develop, and implement scalable data pipelines and associated processes using Google Cloud Platform (GCP) services
Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions
Build and maintain data architecture to support the new data product
Optimize and enhance data processing workflows for performance, reliability, and scalability
Ensure data quality, integrity, and security throughout the data lifecycle
Monitor and troubleshoot data pipelines and systems to ensure smooth operation
Document technical specifications, processes, and workflows for future reference and knowledge sharing
Qualifications
Real-time and batch data integration development experience using GCP services
Experience extracting, transforming, and loading data from various types of sources
Demonstrated ability to lead projects and teams, with experience as a technical lead or similar leadership role
Preferred
• Financial services industry experience
Technical Skills
• SQL
• Google BigQuery
• Google Cloud data pipeline tools (such as Google Cloud Datastream, Dataflow, etc.)
• Google Cloud Composer
• Google Cloud Run
• GitHub
• Python
• Terraform
Preferred
• dbt (open source)
• Google Pub/Sub
• Salesforce data integration experience
• Oracle/SQL Server data integration experience
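As a rough sketch of the extract-transform-load pattern these qualifications reference, the following is a minimal pure-Python example (stdlib only). The CSV source, field names, and the in-memory SQLite target are hypothetical stand-ins for the GCP sources and sinks (e.g. Datastream feeding BigQuery) named above, not part of the posting:

```python
# Minimal batch ETL sketch: extract from a CSV source, apply a basic
# data-quality transform, load into a target table. All names are illustrative.
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize types and drop malformed records (a simple data-quality gate)."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip rows with missing or non-numeric amounts
        out.append((row["account_id"].strip(), round(amount, 2)))
    return out

def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write the cleaned records to the target table, return row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS txns (account_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO txns VALUES (?, ?)", records)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM txns").fetchone()[0]

raw = "account_id,amount\nA-1,10.5\nA-2,not-a-number\nA-3,7.25\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
print(loaded)  # prints 2: the malformed row is filtered out by the transform step
```

In a real GCP pipeline the same three stages would typically map to a managed source connector, a Dataflow or dbt transformation, and a BigQuery load.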
Soft Skills
Exceptional communication and interpersonal skills, with the ability to engage and influence business stakeholders
Strong personality and polished presence, capable of driving collaboration and alignment across teams
Thought leadership and ability to inspire teams and stakeholders in data transformation initiatives
Work Environment
Familiarity with Agile methodologies, including participation in daily stand-ups and iterative development cycles
Ability to work effectively in a fast-paced, collaborative environment.
#TB_EN Job #: 25-31699