E-Solutions

Google Cloud Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Google Cloud Data Engineer on a 6-month contract at a pay rate of "X". It requires 10+ years of GCP experience, strong Python/SQL skills, and expertise in data governance and CI/CD pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#Jenkins #Docker #NoSQL #Data Governance #Documentation #Clustering #Logging #BigQuery #Java #Kubernetes #Terraform #VPC (Virtual Private Cloud) #SQL (Structured Query Language) #Computer Science #Dataflow #Cloud #Apache Beam #Storage #Data Pipeline #Security #Scala #DevOps #Data Security #Database Design #Data Modeling #ETL (Extract, Transform, Load) #Monitoring #Data Integration #Spark (Apache Spark) #Airflow #Data Engineering #IAM (Identity and Access Management) #Python #GCP (Google Cloud Platform)
Role description
Designs, builds, and optimizes secure, scalable, high-performance data pipelines and analytics solutions using Google Cloud Platform services such as BigQuery, Dataflow, Dataproc, Cloud Composer, Cloud Storage (GCS), Cloud Functions, Cloud Run, and Pub/Sub. The role requires 10+ years of experience, expertise in Python and SQL, and a track record of implementing data governance and CI/CD pipelines.
Key Responsibilities
• Pipeline Development: Design, build, and optimize end-to-end data pipelines using GCP-native services (Dataflow, Dataproc, Cloud Storage) and Python.
• Data Modeling & Architecture: Create high-quality, reproducible data models in BigQuery, using partitioning, clustering, and materialized views to improve performance and control costs.
• Streaming & Real-time: Implement real-time streaming pipelines using Pub/Sub with Apache Beam or Spark Streaming.
• Infrastructure & DevOps: Establish CI/CD pipelines for data workflows.
• Security & Governance: Implement data-security best practices, including IAM roles, encryption with customer-managed keys (CMEK), and VPC Service Controls.
• Collaboration: Work with stakeholders to define requirements, mentor junior engineers, and produce technical documentation.
Required Technical Skills
• Platforms: Deep expertise in Google Cloud Platform (GCP).
• Languages: Strong proficiency in SQL and Python (or Java).
• Tools: BigQuery, Dataflow, Cloud Composer (Airflow), Pub/Sub, Cloud Storage, Dataproc, Cloud SQL, Cloud Run, Cloud Functions, Cloud Logging and Monitoring.
• Data Modeling: Database design, ETL/ELT workflows.
• DevOps: Terraform, Jenkins.
Qualifications
• Experience: 10+ years in GCP data engineering.
• Must have delivered at least 4 to 5 end-to-end projects as a senior data engineer using GCP services.
• Strong understanding of database design and data modeling (relational, dimensional, NoSQL).
• Expertise in data integration, ETL/ELT, and data pipeline development.
• Knowledge of cloud security best practices, identity management, and networking.
• Familiarity with DevOps, CI/CD, and containerization (Docker, Kubernetes).
• Excellent communication and problem-solving skills.
• Education: Bachelor's degree in Computer Science, Engineering, or a related field.
• Certifications: Google Cloud Professional Data Engineer certification is highly preferred.
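To illustrate the data-modeling responsibility above (partitioning, clustering, and materialized views in BigQuery), here is a minimal sketch of the kind of DDL involved, wrapped in Python strings. The dataset and table names (`analytics.events`, `analytics.daily_revenue`) and the schema are hypothetical examples, not part of the posting.

```python
# Hypothetical BigQuery DDL: a date-partitioned, clustered fact table
# plus a materialized view for a daily rollup. Names and schema are
# illustrative assumptions only.

fact_table_ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts TIMESTAMP NOT NULL,
  customer_id STRING,
  event_type STRING,
  amount NUMERIC
)
PARTITION BY DATE(event_ts)          -- prune scanned bytes (and cost) by date
CLUSTER BY customer_id, event_type   -- co-locate rows for common filter columns
OPTIONS (partition_expiration_days = 365);
""".strip()

daily_rollup_ddl = """
CREATE MATERIALIZED VIEW IF NOT EXISTS analytics.daily_revenue AS
SELECT DATE(event_ts) AS day, customer_id, SUM(amount) AS revenue
FROM analytics.events
GROUP BY day, customer_id;
""".strip()

print(fact_table_ddl)
print(daily_rollup_ddl)
```

Partitioning limits how much data a date-filtered query scans, while clustering sorts data within each partition by the listed columns; the materialized view lets BigQuery serve the rollup incrementally instead of re-aggregating the base table.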
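The streaming responsibility (Pub/Sub with Apache Beam or Spark Streaming) centers on windowed aggregation. The plain-Python sketch below shows the tumbling-window counting pattern those frameworks provide; the event data and function name are hypothetical, and a real pipeline would use Beam's windowing primitives rather than this hand-rolled version.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp_secs, key) events into fixed, non-overlapping
    (tumbling) windows and count occurrences per key — the basic
    aggregation behind a Pub/Sub + Beam streaming pipeline."""
    counts = defaultdict(int)
    for ts, key in events:
        # Floor the timestamp to the start of its window.
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical events: (epoch seconds, event type)
events = [(0, "click"), (30, "click"), (61, "view"), (75, "click")]
print(tumbling_window_counts(events))
# Window [0, 60) holds two clicks; window [60, 120) holds one view and one click.
```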