

CirrusLabs
Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect/Engineer focused on Google Cloud Platform; the contract length and pay rate are unspecified. Key skills include Python, GCP services, ETL processes, and CI/CD pipeline development. Experience migrating legacy systems to the cloud is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 21, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Issaquah, WA
🧠 - Skills detailed
#JSON (JavaScript Object Notation) #Data Analysis #Python #Deployment #GCP (Google Cloud Platform) #REST API #ETL (Extract, Transform, Load) #Distributed Computing #GitHub #Disaster Recovery #REST (Representational State Transfer) #Scripting #Data Architecture #Cloud #Terraform #Storage #API (Application Programming Interface) #Programming #Data Integration #Migration #Security #Apache Beam #Spark (Apache Spark) #Dataflow #BigQuery #Compliance #SQL (Structured Query Language) #Scala #DevOps #Databases #Perl #Batch #Data Pipeline #Datasets
Role description
Job Summary: Data Architect/Engineer (Google Cloud Platform)
Key Responsibilities
• Data Pipeline Development: Design, build, test, and maintain scalable data pipelines and ETL processes using Python and GCP services (e.g., Dataflow, BigQuery, Pub/Sub); a minimal Beam sketch follows this list.
• Data Integration & Modeling: Implement batch and real-time data integration workflows, optimize data models and architecture for performance and storage efficiency.
• Collaboration & Support: Work with cross-functional teams to gather data requirements and support data analysts with curated datasets and tools.
• System Reliability: Monitor, troubleshoot, and tune data systems for high availability, scalability, and disaster recovery.
• DevOps Enablement: Build and manage CI/CD pipelines using GitHub and Terraform; ensure security compliance and operational readiness.
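To make the pipeline-development responsibility concrete, here is a minimal Apache Beam sketch of a batch ETL job: read CSV files from Cloud Storage, parse each row, and append the results to a BigQuery table. The project ID, bucket, table name, and schema are hypothetical placeholders, and the runner settings show one plausible Dataflow configuration rather than a prescribed setup.

```python
# Minimal batch ETL sketch: GCS CSV -> parse -> BigQuery.
# Project, bucket, dataset, and schema below are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv_line(line: str) -> dict:
    """Split one CSV line into a dict matching the BigQuery schema."""
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "ts": ts}

def run() -> None:
    opts = PipelineOptions(
        runner="DataflowRunner",            # use "DirectRunner" for local tests
        project="example-project",          # hypothetical project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=opts) as pipeline:
        (
            pipeline
            | "ReadCsv" >> beam.io.ReadFromText(
                "gs://example-bucket/events/*.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_csv_line)
            | "WriteBq" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```

The same pipeline code runs locally under the DirectRunner during development and submits to Dataflow unchanged, which is the usual way to test Beam jobs before deployment.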
Mandatory Skills & Qualifications
• Technical Expertise:
• Strong Python programming and Spark experience for data analytics.
• Proficient in GCP services: GCS, Dataflow, Cloud Functions, Composer, Scheduler, Datastream, Pub/Sub, BigQuery, Dataproc.
• Skilled in Apache Beam for batch and stream processing.
• Experience with REST API ingestion, JSON messaging, and scripting (Shell, Perl); an ingestion sketch appears after this section.
• Deep understanding of SQL, cloud-native databases, and data warehousing concepts.
• Engineering & Migration:
• Proven experience in migrating legacy systems to modern cloud-based architectures.
• Familiarity with distributed computing frameworks and large-scale data handling.
• DevOps & Security:
• CI/CD pipeline development with GitHub and Terraform.
• Security integration in deployment workflows.
• Soft Skills:
• Strong problem-solving and analytical abilities.
• Excellent communication and teamwork skills.
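As an illustration of the REST API ingestion and JSON messaging requirements above, the sketch below polls a REST endpoint with requests and publishes each record to a Pub/Sub topic for downstream pipelines to consume. The endpoint URL, project ID, and topic name are assumptions, as is the shape of the response (a JSON array of records).

```python
# Minimal ingestion sketch: REST endpoint -> JSON records -> Pub/Sub.
# The URL, project, and topic below are hypothetical.
import json

import requests
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "raw-events")

def ingest(url: str) -> None:
    """Fetch one page of records and publish each as a JSON message."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    for record in response.json():          # assumes a JSON array response
        data = json.dumps(record).encode("utf-8")
        future = publisher.publish(topic_path, data=data)
        future.result()                     # block until Pub/Sub accepts it

if __name__ == "__main__":
    ingest("https://api.example.com/v1/events")  # hypothetical endpoint
```

A streaming Beam pipeline reading from the same topic (beam.io.ReadFromPubSub) would complete the real-time half of the batch and real-time workflows listed above.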