

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Philadelphia, PA, on a fully onsite (5 days/week) contract. It requires 3+ years of experience with GCP services, programming in Java/Python/SQL, and familiarity with DevOps tools. GCP certification is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 4, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Philadelphia, PA
Skills detailed: #IAM (Identity and Access Management) #Docker #Python #BigQuery #Dataflow #SQL (Structured Query Language) #Apache Airflow #Storage #Automation #GitHub #Apache Beam #Jenkins #Computer Science #Data Warehouse #Scala #DevOps #Agile #Monitoring #GCP (Google Cloud Platform) #Google Analytics #Security #Programming #Data Engineering #Data Processing #Data Lake #Automated Testing #Deployment #Big Data #Airflow #Cloud #ETL (Extract, Transform, Load) #Hadoop #DMS (Data Migration Service) #Kubernetes #Data Pipeline #Terraform #Java
Role description
Job Opportunity: GCP Data Engineer
Location: Philadelphia, PA (Fully Onsite)
Client Domain: Media
5 days onsite.
Job Description
We are seeking a GCP Data Engineer with strong hands-on experience in designing and deploying data pipelines and infrastructure using Google Cloud Platform (GCP) services.
Core Responsibilities
Cloud & Data Engineering (GCP)
• 3+ years of hands-on experience with (see the pipeline sketch after this list):
  • BigQuery
  • Pub/Sub
  • Dataflow / Apache Beam
  • Cloud Composer / Apache Airflow
  • Cloud Functions
  • Cloud Storage
• Strong understanding of:
  • Data lakes, data warehouses, and analytics platforms at scale
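As a concrete illustration of this stack, the sketch below shows a minimal Apache Beam streaming pipeline in Python that reads messages from Pub/Sub, parses them, and appends rows to BigQuery. It is only a sketch: the project, subscription, table, and schema names are placeholders, not details from this posting.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
# All resource names below are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Add --runner=DataflowRunner plus --project/--region/--temp_location
    # to execute the same pipeline on Dataflow instead of locally.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )


if __name__ == "__main__":
    run()
```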
Programming & Development
• Proficient in:
  • Java, Python, and SQL
• Skilled in building scalable data processing and transformation pipelines (a small SQL-from-Python sketch follows)
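To give a sense of the Python-plus-SQL work implied here, the following hypothetical sketch runs a SQL transformation through the BigQuery Python client and writes the result to a destination table. The project, dataset, and table names are invented for illustration.

```python
# Run a SQL transformation in BigQuery from Python and materialize the result.
# Assumes Application Default Credentials; table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

destination = bigquery.TableReference.from_string("my-project.analytics.daily_orders")
job_config = bigquery.QueryJobConfig(
    destination=destination,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

sql = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my-project.raw.orders`
    GROUP BY order_date
"""

query_job = client.query(sql, job_config=job_config)
query_job.result()  # block until the transformation finishes
```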
DevOps & CI/CD
• Experience with:
  • DevOps pipelines and CI/CD
  • Automated testing and deployment (an example test is sketched below)
  • Tools such as Jenkins, GitHub Actions, and Cloud Build
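The automated-testing point can be illustrated with a short pytest-style unit test of a Beam transform, the kind of check a Jenkins, GitHub Actions, or Cloud Build job might run before deployment. The transform under test is a stand-in, not code from this role.

```python
# pytest-style unit test for a Beam transform, runnable in any CI pipeline.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def to_upper(record: str) -> str:
    """Placeholder transform standing in for real pipeline logic."""
    return record.upper()


def test_to_upper_transform():
    with TestPipeline() as p:
        output = (
            p
            | beam.Create(["alpha", "beta"])
            | beam.Map(to_upper)
        )
        assert_that(output, equal_to(["ALPHA", "BETA"]))
```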
Infrastructure & Architecture
• Ability to design and deploy fault-tolerant systems on GCP
• Experience with:
  • Compute Engine, App Engine, Kubernetes Engine, Cloud Functions (a Cloud Functions handler is sketched below)
• Infrastructure automation with:
  • Terraform or Google Cloud Deployment Manager
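As one small example of the serverless side of this list, here is a minimal HTTP-triggered Cloud Function handler in Python using the functions-framework library. The function name and payload shape are assumptions made for illustration; surrounding resources (buckets, service accounts, triggers) would typically be provisioned with Terraform or Deployment Manager rather than by hand.

```python
# Minimal HTTP-triggered Cloud Function sketch (Python runtime).
# Function name and payload fields are hypothetical.
import functions_framework
from flask import Request


@functions_framework.http
def ingest_event(request: Request):
    """Accepts a JSON event and acknowledges it; real logic might publish
    to Pub/Sub or write to Cloud Storage."""
    payload = request.get_json(silent=True) or {}
    event_id = payload.get("event_id", "unknown")
    return {"status": "received", "event_id": event_id}, 200
```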
Monitoring & Security
• Use of the GCP Operations Suite (formerly Stackdriver) for system and performance monitoring (see the logging sketch below)
• Implementation of IAM roles, service accounts, and security policies
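A minimal sketch of the monitoring side, assuming the google-cloud-logging client library: it routes Python's standard logging module to Cloud Logging in the GCP Operations Suite, where log-based metrics and alerts can then be configured. The job name logged here is a placeholder.

```python
# Route standard Python logging to Cloud Logging so pipeline jobs can be
# monitored in the GCP Operations Suite. Assumes Application Default Credentials.
import logging

import google.cloud.logging

client = google.cloud.logging.Client()
client.setup_logging()  # attaches a Cloud Logging handler to the root logger

logging.info("Pipeline run started: %s", "daily_orders")  # placeholder job name
```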
Preferred Qualifications
• GCP certification (e.g., Professional Data Engineer)
• Experience with:
  • Agile methodologies
  • Docker and Kubernetes
• Bachelor's degree in Computer Science, IT, or a related field
Mandatory Skills
• GCP Services:
  • Cloud Storage, BigQuery, Dataproc, Dataflow, Data Fusion, Datastream, Cloud Composer, Cloud Pub/Sub, Workflows, DMS, Dataform, Analytics Hub
• Data & Workflow Tools:
  • Apache Airflow, GCP Dataflow (an example Airflow DAG follows this list)
• Languages:
  • Java, Python, Scala, ANSI SQL
• Big Data Ecosystem:
  • Hadoop and related technologies
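To show how the workflow tools above fit together, here is a sketch of a Cloud Composer / Apache Airflow DAG that schedules a daily BigQuery load using the Google provider package's BigQueryInsertJobOperator. The DAG id, schedule, SQL, and table names are placeholders, not requirements from the posting.

```python
# Sketch of a Cloud Composer / Airflow DAG running a daily BigQuery load.
# Requires apache-airflow-providers-google; all names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                "query": (
                    "SELECT * FROM `my-project.raw.orders` "
                    "WHERE _PARTITIONDATE = '{{ ds }}'"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "orders_{{ ds_nodash }}",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```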