

Rivago Infotech Inc
Lead Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Architect with 10–15 years of experience, specializing in Google Cloud Platform. Contract length is unspecified, with a competitive pay rate. Key skills include SQL, Python, and GCP expertise, particularly in BigQuery and Dataflow.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Palm Beach, FL
-
🧠 - Skills detailed
#GIT #Data Management #Scala #SQL Queries #Data Governance #Data Pipeline #Version Control #Data Storage #Data Architecture #Data Engineering #DevOps #Storage #BigQuery #Metadata #Security #Automation #Monitoring #ETL (Extract, Transform, Load) #Apache Beam #Data Quality #GCP (Google Cloud Platform) #Leadership #Data Modeling #Data Processing #Dataflow #Logging #Cloud #SQL (Structured Query Language) #Python
Role description
Job Description:
We are looking for a highly experienced Lead Data Engineer/Architect with deep expertise in Google Cloud Platform (GCP) to lead the design, development, and optimization of large-scale, cloud-native data platforms. The role requires strong hands-on capabilities along with technical leadership, architectural decision-making, and mentoring responsibilities.
Key Responsibilities
· Lead end-to-end design and implementation of scalable data architectures on GCP.
· Architect, develop, and optimize data pipelines and data storage solutions using Dataflow, Cloud SQL, and BigQuery.
· Own and optimize complex SQL queries and Python-based data processing frameworks that replicate data from ERP systems (a minimal pipeline sketch follows this list).
· Establish best practices for data modeling, ETL/ELT frameworks, and pipeline orchestration.
· Drive DevOps and CI/CD strategies for data platforms using GCP-native tools.
· Ensure platform reliability, data quality, security, and cost optimization.
· Act as a technical advisor to stakeholders, translating business requirements into scalable data solutions.
· Mentor and guide junior and mid-level data engineers.
· Conduct design reviews, performance tuning, and production issue resolution.
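For illustration only, the sketch below shows the kind of Dataflow (Apache Beam) pipeline this role would own: reading an ERP extract from Cloud Storage and loading it into BigQuery. It is a minimal outline under assumed names, not a production design; the project, bucket, dataset, table, and column names are hypothetical placeholders.

```python
# Minimal Apache Beam sketch: replicate a CSV ERP extract into BigQuery.
# All resource names below are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DirectRunner",          # swap for "DataflowRunner" on GCP
        project="example-project",      # placeholder project id
        region="us-east1",
        temp_location="gs://example-bucket/tmp",
    )

    columns = ["order_id", "customer_id", "order_date", "amount"]

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read the raw ERP extract, skipping the header row.
            | "ReadErpExtract" >> beam.io.ReadFromText(
                "gs://example-bucket/erp/orders.csv", skip_header_lines=1)
            # Turn each CSV line into a dict keyed by column name.
            | "ParseCsvRow" >> beam.Map(
                lambda line: dict(zip(columns, line.split(","))))
            # Cast the amount field so it matches the numeric BigQuery column.
            | "CastAmount" >> beam.Map(
                lambda row: {**row, "amount": float(row["amount"])})
            # Append the rows into a BigQuery table, creating it if needed.
            | "LoadToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:erp_replica.orders",
                schema="order_id:STRING,customer_id:STRING,"
                       "order_date:DATE,amount:FLOAT",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline code would typically run locally with DirectRunner for validation and be submitted with the Dataflow runner for production loads.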
Required Skills & Qualifications:
· 10–15 years of overall experience in data engineering and analytics platforms.
· Strong hands-on expertise in SQL and Python (see the query sketch after this list).
· Extensive experience with Google Cloud Platform, including:
o BigQuery
o Cloud SQL
o Dataflow (Apache Beam)
· Proven experience in GCP DevOps, including CI/CD pipelines, monitoring, logging, and automation.
· Strong understanding of distributed systems, data warehousing, and large-scale data processing.
· Experience designing and operating high-volume, high-availability data platforms.
· Hands-on experience with version control systems (Git).
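As a small companion example, the hedged sketch below uses the BigQuery Python client to run a parameterized aggregate query over the replicated ERP data. The dataset, table, and column names are again placeholders assumed for illustration.

```python
# Sketch: parameterized BigQuery query via the Python client library.
# Dataset, table, and column names are hypothetical placeholders.
import datetime

from google.cloud import bigquery


def top_customers(project_id: str = "example-project"):
    """Return the ten highest-spending customers since a cutoff date."""
    client = bigquery.Client(project=project_id)

    sql = """
        SELECT customer_id, SUM(amount) AS total_amount
        FROM `example-project.erp_replica.orders`
        WHERE order_date >= @cutoff
        GROUP BY customer_id
        ORDER BY total_amount DESC
        LIMIT 10
    """
    # Bind the cutoff as a query parameter instead of formatting it into the SQL.
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter(
                "cutoff", "DATE", datetime.date(2025, 1, 1)),
        ]
    )
    rows = client.query(sql, job_config=job_config).result()
    return [dict(row) for row in rows]


if __name__ == "__main__":
    for row in top_customers():
        print(row)
```

Parameterized queries keep literal values out of the SQL string, which is the usual pattern when such queries sit inside Python-based processing frameworks.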
Good to Have
· Experience with real-time/streaming architecture replicating data from legacy ERP systems.
· Knowledge of data governance, metadata management, and security best practices.
· GCP certifications (Professional Data Engineer / Cloud Architect).
· Experience working in large enterprise environments (Supply Chain knowledge a plus).