Insight Global

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 5+ years of experience, including 3+ years on Google Cloud Platform (GCP). Contract length is unspecified, the listed day rate is $544 USD, and it requires onsite work in Irvine, CA, 3x per week. Key skills include Python, advanced SQL, GCP data services, and team leadership.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
544
-
🗓️ - Date
April 29, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Irvine, CA
-
🧠 - Skills detailed
#Data Pipeline #BigQuery #Datasets #Data Modeling #Data Processing #Data Management #Airflow #Scala #Data Engineering #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Metadata #Version Control #Data Governance #Programming #Python #Cloud #Leadership
Role description
Job Description

We are looking for an experienced Senior Data Engineer with 5+ years of experience, including at least 3 years of hands‑on professional experience on Google Cloud Platform (GCP). The ideal candidate has strong expertise in building cloud‑based data pipelines and engineering solutions, can lead and guide teams, take end‑to‑end ownership of initiatives, and work flexibly in a dynamic, fast‑paced environment. Candidates should be local to Southern California and able to report onsite 3x per week in Irvine, CA.

Key Responsibilities
• Design, develop, and maintain scalable data pipelines and workflows on GCP
• Work extensively with GCP data services such as Cloud Run, Cloud Functions, Composer (Airflow), Pub/Sub, BigQuery, and related services
• Lead and mentor team members, provide technical guidance, and ensure high‑quality delivery
• Collaborate with cross‑functional teams to understand data requirements and deliver robust data solutions
• Take end‑to‑end ownership of assigned tasks and ensure timely execution
• Optimize data processes for performance, reliability, scalability, and cost efficiency
• Support data modeling efforts, including master data structures, to ensure consistency and reusability across enterprise datasets
• Partner with data governance stakeholders to support metadata management, cataloging, and trusted data definitions

Required Skills & Experience
• Strong proficiency in Python programming and database technologies, including advanced SQL (mandatory)
• Solid understanding of distributed data processing and cloud architecture
• Experience with workflow orchestration tools, preferably Airflow / Cloud Composer
• Strong experience with GCP data services such as BigQuery, Pub/Sub, and related services
• Knowledge of cloud cost optimization strategies
• Experience with version control systems and CI/CD pipelines
• Strong analytical and problem‑solving skills
• Ability to work independently, take ownership, and adapt in a fast‑paced environment
• Excellent communication and team leadership skills