Programmers.io

GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer, contract length unspecified, offering competitive pay. Key skills include GCP expertise, Python, SQL, data pipeline orchestration, and DevOps practices. Experience with BigQuery and automation tools is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 15, 2026
-
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Santa Clara County, CA
-
🧠 - Skills detailed
#Airflow #Business Analysis #Storage #Datasets #Docker #Jenkins #Cloud #Monitoring #Data Ingestion #Deployment #BI (Business Intelligence) #Automation #Data Pipeline #Data Processing #Programming #GitHub #Version Control #Data Science #Kubernetes #GitLab #Batch #Big Data #Data Modeling #Dataflow #Python #Apache Airflow #ML (Machine Learning) #SQL (Structured Query Language) #DevOps #GIT #Scala #BigQuery #Data Engineering #GCP (Google Cloud Platform) #Libraries
Role description
Note: GC, GC-EAD, OPT, and CPT candidates on C2C cannot be considered for this role.

We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.

Core Responsibilities
• Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
• Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.
• Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
• Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.
• DevOps Integration: Collaborate with DevOps teams to ensure smooth deployment, monitoring, and maintenance of data pipelines and infrastructure in cloud environments.

Required Skills & Experience
• Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
  o BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
  o Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
• Programming & Querying:
  o Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
  o SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
• Data Pipeline Orchestration: Prior experience with workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).
• DevOps/CI/CD:
  o Strong understanding of DevOps principles and practices.
  o Experience with CI/CD pipelines, automation tools, and deployment strategies.
  o Familiarity with version control systems (Git) and tools such as GitLab CI/CD, GitHub Actions, or Jenkins.
  o Knowledge of containerization (Docker) and orchestration tools (Kubernetes) is a plus.
• Monitoring & Automation: Ability to implement monitoring solutions and automate operational tasks to ensure reliability and scalability.