GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer in Philadelphia, PA, on a fully onsite (5 days per week) contract. It requires 3+ years of experience with GCP services, programming in Java/Python/SQL, and familiarity with DevOps tools. GCP certification preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 4, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Philadelphia, PA
-
🧠 - Skills detailed
#IAM (Identity and Access Management) #Docker #Python #BigQuery #Dataflow #SQL (Structured Query Language) #Apache Airflow #Storage #Automation #GitHub #Apache Beam #Jenkins #Computer Science #Data Warehouse #Scala #DevOps #Agile #Monitoring #GCP (Google Cloud Platform) #Google Analytics #Security #Programming #Data Engineering #Data Processing #Data Lake #Automated Testing #Deployment #Big Data #Airflow #Cloud #ETL (Extract, Transform, Load) #Hadoop #DMS (Data Migration Service) #Kubernetes #Data Pipeline #Terraform #Java
Role description
Job Opportunity: GCP Data Engineer

📍 Location: Philadelphia, PA (Fully Onsite)

💼 Client Domain: Media

💵 5 DAYS ONSITE

🧾 Job Description

We are seeking a GCP Data Engineer with strong hands-on experience in designing and deploying data pipelines and infrastructure using Google Cloud Platform (GCP) services.

✅ Core Responsibilities

🌐 Cloud & Data Engineering (GCP)

• 3+ years of hands-on experience with BigQuery, Pub/Sub, Dataflow / Apache Beam, Cloud Composer / Apache Airflow, Cloud Functions, and Cloud Storage

• Strong understanding of data lakes, data warehouses, and analytics platforms at scale

Programming & Development

• Proficient in Java, Python, and SQL

• Skilled in building scalable data processing and transformation pipelines

DevOps & CI/CD

• Experience with DevOps pipelines, CI/CD, and automated testing and deployment

• Tools such as Jenkins, GitHub Actions, and Cloud Build

Infrastructure & Architecture

• Ability to design and deploy fault-tolerant systems on GCP

• Experience with Compute Engine, App Engine, Kubernetes Engine, and Cloud Functions

• Infrastructure automation with Terraform or Google Cloud Deployment Manager

Monitoring & Security

• Use of GCP Operations Suite (formerly Stackdriver) for system and performance monitoring

• Implementation of IAM roles, service accounts, and security policies

Preferred Qualifications

• GCP certification (e.g., Professional Data Engineer)

• Experience with Agile methodologies, Docker, and Kubernetes

• Bachelor's degree in Computer Science, IT, or a related field

Mandatory Skills

• GCP Services: Cloud Storage, BigQuery, Dataproc, Dataflow, Data Fusion, Datastream, Cloud Composer, Cloud Pub/Sub, Workflows, DMS, Dataform, Google Analytics Hub

• Data & Workflow Tools: Apache Airflow, GCP Dataflow

• Languages: Java, Python, Scala, ANSI SQL

• Big Data Ecosystem: Hadoop and related technologies
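For candidates gauging fit, the core of the role is the extract-transform-load pattern that tools like Dataflow / Apache Beam scale out. A toy sketch in plain Python (no GCP dependencies; the record schema and all names are illustrative, not from the posting):

```python
# Toy ETL sketch: the extract -> transform -> load shape that
# Dataflow / Apache Beam pipelines implement at scale.
# Schema and field names are hypothetical.

def extract(rows):
    """Yield raw records; a real pipeline would read from Pub/Sub or Cloud Storage."""
    yield from rows

def transform(records):
    """Drop rows missing a key field and normalize the event name."""
    for rec in records:
        if rec.get("user_id") is None:
            continue  # skip invalid rows
        yield {"user_id": rec["user_id"], "event": rec["event"].lower()}

def load(records):
    """Materialize the output; a real pipeline would stream into BigQuery."""
    return list(records)

raw = [
    {"user_id": 1, "event": "CLICK"},
    {"user_id": None, "event": "VIEW"},
    {"user_id": 2, "event": "Purchase"},
]
result = load(transform(extract(raw)))
# → [{'user_id': 1, 'event': 'click'}, {'user_id': 2, 'event': 'purchase'}]
```

In Beam the same shape would be expressed as `beam.Map` / `beam.Filter` transforms over a `PCollection`, with I/O connectors replacing `extract` and `load`.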