Realign LLC

GCP Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Architect with a contract length of "unknown" and a pay rate of "unknown," located in San Jose, CA. Key skills include GCP expertise, Python, SQL, data pipeline orchestration, and DevOps practices.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 1, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Libraries #Azure #Data Science #Data Modeling #Big Data #Data Processing #Datasets #Scala #BigQuery #Batch #BI (Business Intelligence) #Java #Dataflow #GCP (Google Cloud Platform) #Storage #Airflow #DevOps #Business Analysis #Deployment #GIT #Data Engineering #Programming #Data Pipeline #Version Control #ML (Machine Learning) #Cloud #Data Ingestion #GitHub #Python #Apache Airflow #SQL (Structured Query Language) #GitLab #Data Architecture
Role description
Job Type: Contract
Job Category: IT
Role: GCP Data Architect
Location: San Jose, CA

Job Description:
We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.

Core Responsibilities:
- Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
- Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.
- Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
- Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.

Required Skills & Experience:
- Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
  - BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
  - Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
- Programming & Querying:
  - Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
  - SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
- Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar). A minimal sketch of this kind of pipeline appears after this section.
- DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.

Required Skills:
- Cloud Architect with Azure and Java
- Data Architect
- Enterprise Data Center Architect
- Enterprise Infrastructure Architect
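
For illustration only (not part of the original posting): a minimal sketch of the orchestration stack named above, assuming Apache Airflow with the Google provider package, loading a daily batch of CSV files from Cloud Storage into BigQuery. The bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal sketch: daily batch load from Cloud Storage into BigQuery via Airflow.
# Assumes apache-airflow-providers-google is installed; names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",  # batch (scheduled) analytics load
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",                        # hypothetical GCS bucket
        source_objects=["events/{{ ds }}/*.csv"],               # files partitioned by run date
        destination_project_dataset_table="analytics.events",   # hypothetical dataset.table
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```

In practice a Cloud Composer environment would run this DAG on the stated schedule, and the destination table would feed downstream BI and data science workloads.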