

Realign LLC
GCP Data Architect-5
Featured Role | Apply direct with Data Freelance Hub
This role is for a "GCP Data Architect" on a contract basis, located in San Jose/Santa Clara, CA, or Plano, TX. It requires expertise in GCP services, Python, SQL, and data pipeline orchestration, focusing on big data solutions and cloud environments.
Country
United States
Currency
Unknown
-
Day rate
Unknown
-
Date
November 12, 2025
Duration
Unknown
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Santa Clara, CA
-
Skills detailed
#Big Data #SQL (Structured Query Language) #Batch #GIT #Data Ingestion #Data Engineering #Airflow #BI (Business Intelligence) #BigQuery #Programming #Storage #GCP (Google Cloud Platform) #Dataflow #Data Processing #GitHub #Data Science #Deployment #Libraries #GitLab #Datasets #DevOps #Data Pipeline #Cloud #Data Architecture #ML (Machine Learning) #Scala #Version Control #Python #Business Analysis #Apache Airflow #Data Modeling
Role description
Job Type: Contract
Job Category: IT
Role: GCP Data Architect
Location: San Jose / Santa Clara, CA and Plano, TX (Day 1 Onsite)
Engagement: Contract & FTE (both)
Job Description: We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.
Core Responsibilities
Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.
Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.
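To make the batch-versus-streaming responsibility above concrete, here is a minimal plain-Python sketch of a tumbling-window aggregation, the kind of computation that is the same over a bounded batch or an unbounded stream. This is only an illustration of the concept; the role's actual stack uses GCP services such as Dataflow/Beam, and the event data here is invented.

```python
from collections import defaultdict

# Toy events as (timestamp_seconds, value) pairs; in a real pipeline these
# would arrive from a source such as Pub/Sub (streaming) or Cloud Storage (batch).
events = [(3, 10), (7, 20), (12, 5), (14, 15), (21, 30)]

def tumbling_window_sums(events, window_size):
    """Group events into fixed-size time windows and sum their values.

    The aggregation logic is identical whether it runs over a bounded
    batch (all events known up front) or an unbounded stream (windows
    emitted as they close); frameworks like Beam unify the two cases.
    """
    sums = defaultdict(int)
    for ts, value in events:
        window_start = (ts // window_size) * window_size
        sums[window_start] += value
    return dict(sums)

print(tumbling_window_sums(events, window_size=10))
# windows: [0,10) -> 30, [10,20) -> 20, [20,30) -> 30
```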
Required Skills & Experience
Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
Programming & Querying:
Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).
DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.
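For context on the orchestration requirement above: at their core, tools like Airflow, Cloud Composer, and Dagster execute a directed acyclic graph of dependent tasks. The sketch below shows that core idea using only the Python standard library (`graphlib`); the task names are hypothetical, and real orchestrators add scheduling, retries, and operators on top of this.

```python
from graphlib import TopologicalSorter

# A toy pipeline graph: each task maps to the set of tasks it depends on.
# Orchestrators such as Airflow or Dagster express the same structure with
# richer operators, schedules, and retry semantics.
pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_warehouse": {"transform"},
    "refresh_dashboards": {"load_warehouse"},
}

def run_pipeline(graph):
    """Execute tasks in dependency order and return the run order."""
    order = list(TopologicalSorter(graph).static_order())
    for task in order:
        # A real task would call out to BigQuery, Dataflow, etc.
        print(f"running {task}")
    return order

run_pipeline(pipeline)
```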
Required Skills
PERFORMANCE ARCHITECT





