Sr Google Cloud Platform Data Engineers

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr Google Cloud Platform Data Engineer, a 12-month contract position offering competitive pay. It requires 12+ years of IT experience, including 4+ years at a major bank, and extensive expertise in GCP, Dataflow, BigQuery, Python, and Apache Spark.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
464
-
πŸ—“οΈ - Date discovered
September 26, 2025
πŸ•’ - Project duration
12 months
-
🏝️ - Location type
Unknown
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
New York, NY
-
🧠 - Skills detailed
#Security #GCP (Google Cloud Platform) #Spark (Apache Spark) #IAM (Identity and Access Management) #Data Storage #Data Processing #Dataflow #Apache Beam #ETL (Extract, Transform, Load) #Scripting #BigQuery #Data Ingestion #Cloud #Monitoring #SQL (Structured Query Language) #Python #Data Pipeline #Data Manipulation #Apache Spark #Data Engineering #Data Security #Storage
Role description
Seeking Senior Google Cloud Platform Data Engineers to design, develop, and maintain robust data pipelines that transform raw data into valuable insights. Responsibilities include:
- Design, develop, and maintain data pipelines using GCP services such as Dataflow, Dataproc, and Pub/Sub.
- Develop and implement data ingestion and transformation processes using tools such as Apache Beam and Apache Spark.
- Manage and optimize data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL.
- Implement data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center.
- Monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring tools.
- Collaborate with data experts, analysts, and product teams to understand data needs and deliver effective solutions.
- Automate data processing tasks using scripting languages such as Python.
Requirements (all required):
- 12+ years of IT experience.
- 4+ years of recent experience (within the last 6 years, i.e., since 2019) working with GCP at a major US bank or brokerage house.
- 8+ years of experience as a Data Engineer designing, developing, and deploying data pipelines.
- 5+ years of experience with Google Cloud Platform (GCP).
- 5+ years of experience with the GCP services Dataflow, Dataproc, and Pub/Sub.
- 5+ years of experience with core data services: GCS, BigQuery, Cloud Storage, and Dataflow.
- 5+ years of experience managing and optimizing data storage in BigQuery, Cloud Storage, and Cloud SQL.
- 5+ years of experience implementing data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center.
- 5+ years of experience with Python and SQL for data manipulation and querying.
- 5+ years of experience with the distributed data processing frameworks Apache Beam and Apache Spark.
- Experience with data security and access control principles.
Benefits
The benefit package includes:
- Paid sick time
- Medical, dental, vision, and life insurance
- 401(k) including employer match
- HSA, plus short-term and long-term disability coverage
We are an EEO/Veterans/Disabled employer.
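For candidates gauging the Python and SQL data-manipulation skills this role calls for, here is a minimal, hypothetical sketch of that kind of work. It uses Python's built-in sqlite3 module as a self-contained stand-in for a warehouse such as BigQuery; the table, columns, and data are illustrative only, not part of the role description.

```python
import sqlite3

# Hypothetical raw records to ingest: (day, event kind, count).
raw_events = [
    ("2025-09-26", "login", 3),
    ("2025-09-26", "purchase", 1),
    ("2025-09-27", "login", 5),
]

# In-memory database as a stand-in for a managed warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, kind TEXT, n INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", raw_events)

# Transform with SQL: total events per day, ordered by day.
rows = conn.execute(
    "SELECT day, SUM(n) FROM events GROUP BY day ORDER BY day"
).fetchall()
print(rows)  # [('2025-09-26', 4), ('2025-09-27', 5)]
```

In production the same ingest-then-aggregate pattern would typically run inside a Dataflow/Apache Beam pipeline writing to BigQuery rather than a local database.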