Wimora

Google Cloud Platform (GCP) Python Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Google Cloud Platform (GCP) Python Developer on a contract basis, requiring 10+ years of Python experience, strong GCP expertise, and a Bachelor's degree. Key skills include event-driven architecture, Avro serialization, and RPC-style APIs. The pay rate is unspecified; the listed location is Boston, MA.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 8, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Data Transformations #Python #Dataflow #Documentation #Scala #Deployment #ETL (Extract, Transform, Load) #Data Engineering #GCP (Google Cloud Platform) #Monitoring #Data Pipeline #Cloud #JSON (JavaScript Object Notation)
Role description
Experience level: Mid-Senior
Experience required: 10+ years
Job function: Information Technology
Industry: Information Technology & Services
Employment type: Contract
Visa sponsorship: No
Education: Bachelor's degree required

About The Role
Our client is seeking a highly skilled Google Cloud Platform (GCP) Python Developer to design, build, and optimize backend services and event-driven data pipelines. This role is ideal for someone with strong Python expertise and deep hands-on experience across GCP services, distributed systems, and modern event-driven architectures. You will be responsible for developing backend components, integrating cloud-native services, orchestrating asynchronous systems, and ensuring robust schema management across multiple data workflows.

Responsibilities
• Design and develop backend services and applications using Python and GCP SDKs.
• Build and maintain event-driven architectures using Pub/Sub, Dataflow, Cloud Functions, and Cloud Run (a publishing sketch follows this description).
• Implement asynchronous communication and distributed-system patterns, ensuring effective event choreography without central orchestrators (see the consumer sketch after this description).
• Develop RPC-style APIs and backend communication frameworks within GCP environments.
• Design, define, and evolve Avro schemas for event serialization and validation.
• Integrate and optimize cloud-native workflows for scalability, performance, and reliability.
• Collaborate with cross-functional teams to ensure smooth deployment and operational support.
• Write clean, maintainable, testable code with strong documentation.

Must-Have Skills
• 10+ years of hands-on Python development experience.
• Strong expertise with GCP services, specifically:
  • Pub/Sub
  • Dataflow
  • Cloud Functions
  • Cloud Run
• Deep understanding of event-driven architecture, including choreography patterns, asynchronous messaging, and distributed systems.
• Experience with serialization formats, especially Avro for schema definition and evolution.
• Familiarity with RPC-style APIs and backend service communication patterns.
• Strong problem-solving skills in cloud-native environments.

Good-to-Have Skills
• Experience with other serialization formats: Protobuf, JSON, etc.
• Background in building pipelines or data transformations using GCP data engineering tools.
• Experience deploying services in highly scalable, production-grade cloud environments.
• Knowledge of CI/CD, monitoring, and infrastructure best practices on GCP.

Ideal Candidate
The ideal candidate is a strong backend engineer with a passion for scalable distributed systems, event-driven architectures, and high-performance cloud services. They can independently take ownership of cloud pipeline workflows, write production-grade Python code, and collaborate effectively in a cloud-native engineering environment.
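For context on the stack described above, here is a minimal sketch of what publishing an Avro-encoded event to Pub/Sub can look like in Python. The project ID, topic name, and OrderCreated schema are hypothetical illustrations, and the sketch assumes the google-cloud-pubsub and fastavro packages; it shows the general pattern, not this client's implementation.

```python
# Minimal sketch: publish an Avro-encoded event to Pub/Sub.
# Assumes `pip install google-cloud-pubsub fastavro`; the project ID,
# topic name, and schema below are hypothetical placeholders.
import io

import fastavro
from google.cloud import pubsub_v1

# Hypothetical Avro schema for an order event; in practice the schema
# would live in a registry (or a Pub/Sub schema resource) and evolve
# under Avro's compatibility rules.
ORDER_SCHEMA = fastavro.parse_schema({
    "type": "record",
    "name": "OrderCreated",
    "namespace": "example.events",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount_cents", "type": "long"},
        {"name": "currency", "type": "string", "default": "USD"},
    ],
})


def encode_event(record: dict) -> bytes:
    """Serialize a record to Avro binary (schemaless) bytes."""
    buf = io.BytesIO()
    fastavro.schemaless_writer(buf, ORDER_SCHEMA, record)
    return buf.getvalue()


def publish_order_created(project_id: str, topic_id: str, record: dict) -> str:
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    # publish() is asynchronous; .result() blocks until the server acks.
    future = publisher.publish(topic_path, data=encode_event(record))
    return future.result()  # server-assigned message ID


if __name__ == "__main__":
    message_id = publish_order_created(
        "my-gcp-project",   # hypothetical project
        "order-events",     # hypothetical topic
        {"order_id": "A-1001", "amount_cents": 2599, "currency": "USD"},
    )
    print(f"published message {message_id}")
```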
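And the consuming side of the same choreography: a sketch of a Pub/Sub-triggered Cloud Function (2nd gen) that reacts to the event on its own, with no central orchestrator. The handler name and schema are the same hypothetical ones as above, and the sketch assumes the functions-framework and fastavro packages.

```python
# Minimal sketch: a Pub/Sub-triggered Cloud Function that consumes the
# Avro event published above. Assumes `pip install functions-framework
# fastavro`; handler and schema names are hypothetical.
import base64
import io

import fastavro
import functions_framework

ORDER_SCHEMA = fastavro.parse_schema({
    "type": "record",
    "name": "OrderCreated",
    "namespace": "example.events",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount_cents", "type": "long"},
        {"name": "currency", "type": "string", "default": "USD"},
    ],
})


@functions_framework.cloud_event
def handle_order_created(cloud_event):
    # Pub/Sub delivers the payload base64-encoded inside the CloudEvent.
    raw = base64.b64decode(cloud_event.data["message"]["data"])
    record = fastavro.schemaless_reader(io.BytesIO(raw), ORDER_SCHEMA)
    # Each service owns its own reaction; raising an exception nacks the
    # message so Pub/Sub redelivers it.
    print(f"processing order {record['order_id']} "
          f"for {record['amount_cents']} cents")
```

In a choreographed design like this, each downstream service subscribes to the topics it cares about and publishes its own follow-on events, which is what lets the system grow without a central orchestrator.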