Sr. GCP Data Engineer (15+ Years Mandatory)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. GCP Data Engineer in Sunnyvale, CA, on a long-term contract. It requires 15+ years of experience, strong GCP skills, and expertise in big data technologies. A bachelor's degree in computer science (or equivalent experience) is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 17, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Sunnyvale, CA
🧠 - Skills detailed
#Data Lake #Agile #Scrum #Kafka (Apache Kafka) #Scripting #Cloud #Data Warehouse #Apache Airflow #Data Pipeline #Perl #Hadoop #Apache Hive #Big Data #Data Processing #GCP (Google Cloud Platform) #Airflow #Consulting #Jira #Data Engineering #Automated Testing #Java #BigQuery #Computer Science #Apache Kafka #Jenkins #Programming #Scala #Spark (Apache Spark) #Physical Data Model #Apache Spark #RDBMS (Relational Database Management System) #BitBucket #Python
Role description
Cloudious LLC is one of the fastest-emerging IT solutions and services companies, headquartered in San Jose, CA, with global offices in Canada, EMEA, and APAC. We are currently hiring a seasoned Sr. GCP Data Engineer who brings a strong consulting mindset.

Sr. GCP Data Engineer | Sunnyvale, CA (on-site) | Long-term contract | 15+ years mandatory

As a Senior Data Engineer, you will:
- Design and develop big data applications using the latest open-source technologies.
- Work in an offshore, managed-outcome delivery model (prior experience with this model is desired).
- Develop logical and physical data models for big data platforms.
- Automate workflows using Apache Airflow (see the illustrative sketch at the end of this description).
- Create data pipelines using Apache Hive, Apache Spark, and Apache Kafka.
- Provide ongoing maintenance and enhancements to existing systems, and participate in rotational on-call support.
- Learn our business domain and technology infrastructure quickly, and share your knowledge freely and actively with the team.
- Mentor junior engineers on the team.
- Lead daily standups and design reviews.
- Groom and prioritize the backlog using JIRA.
- Act as the point of contact for your assigned business domain.

Requirements:
- 4+ years of recent GCP experience, including building data pipelines in GCP with Dataproc, GCS, and BigQuery.
- 10+ years of hands-on experience developing data warehouse solutions and data products.
- 6+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive or Spark, and Airflow or another workflow orchestration solution.
- 5+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
- Experience with programming languages: Python, Java, Scala, etc.
- Experience with scripting languages: Perl, Shell, etc.
- Practice working with, processing, and managing large data sets (multi-TB/PB scale).
- Exposure to test-driven development and automated testing frameworks.
- Background in Scrum/Agile development methodologies.
- Capable of delivering on multiple competing priorities with little supervision.
- Excellent verbal and written communication skills.
- Bachelor's degree in computer science, or equivalent experience.

The most successful candidates will also have experience with:
- Gitflow
- Atlassian products: Bitbucket, JIRA, Confluence, etc.
- Continuous integration tools such as Bamboo, Jenkins, or TFS.
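For illustration only, here is a minimal sketch of the kind of Airflow workflow automation this role involves: a DAG that submits a PySpark job to Dataproc and loads the output from GCS into BigQuery. It assumes Airflow 2.4+ with the apache-airflow-providers-google package installed; the project, bucket, cluster, dataset, and file names are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch: orchestrate a Dataproc Spark job and a BigQuery load.
# All identifiers below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

PROJECT_ID = "my-gcp-project"    # placeholder
REGION = "us-west1"              # placeholder
BUCKET = "my-data-lake-bucket"   # placeholder

with DAG(
    dag_id="daily_sales_pipeline",     # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    # Submit a PySpark transformation job to an existing Dataproc cluster.
    transform = DataprocSubmitJobOperator(
        task_id="spark_transform",
        project_id=PROJECT_ID,
        region=REGION,
        job={
            "placement": {"cluster_name": "etl-cluster"},  # placeholder
            "pyspark_job": {
                "main_python_file_uri": f"gs://{BUCKET}/jobs/transform.py"
            },
        },
    )

    # Load the job's Parquet output from GCS into a BigQuery table.
    load = GCSToBigQueryOperator(
        task_id="load_to_bigquery",
        bucket=BUCKET,
        source_objects=["output/sales/*.parquet"],
        destination_project_dataset_table=f"{PROJECT_ID}.analytics.sales",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )

    transform >> load
```

The same pattern extends to Hive jobs or Kafka-fed pipelines: the orchestration layer stays the same while the per-task operators change.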