MokshaaLLC

Senior GCP Data Engineer (BigQuery, Airflow, Python)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior GCP Data Engineer based in Santa Clara, CA or Plano, TX; the contract length is unspecified. Pay starts at $62/hr. Key skills include GCP (BigQuery), Python, SQL, and Apache Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
496
-
🗓️ - Date
November 1, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Santa Clara, CA
-
🧠 - Skills detailed
#Libraries #Data Science #Data Modeling #Big Data #Data Processing #Datasets #Scala #BigQuery #Batch #BI (Business Intelligence) #Storage #Dataflow #GCP (Google Cloud Platform) #Airflow #DevOps #Business Analysis #Deployment #GIT #Data Engineering #Programming #Data Pipeline #Version Control #ML (Machine Learning) #Cloud #Data Ingestion #GitHub #Python #Apache Airflow #SQL (Structured Query Language) #GitLab
Role description
Senior Data Engineer (BigQuery, Airflow, Python)
Location: Santa Clara, CA and Plano, TX (Day 1 onsite; 2 openings)
Pay Rate: $62/hr and up (negotiable based on experience and engagement type)
Open for W2 / C2C / 1099

Job Description:
We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.

Core Responsibilities
• Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
• Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence, advanced analytics, and machine learning.
• Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
• Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.

Required Skills & Experience
• Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
   • BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
   • Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
• Programming & Querying:
   • Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
   • SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
• Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).
• DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.

Interested candidates, please apply or share your resumes at jobreqs@mokshaallc.in.