

Quantum World Technologies Inc.
GCP Data Engineer Architect or Lead
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer Architect or Lead in Palm Beach, FL, with the contract length open. It offers a competitive pay rate and requires 10-15 years of data engineering experience, strong SQL and Python skills, and GCP expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Palm Beach, FL
-
🧠 - Skills detailed
#GIT #Python #Data Quality #Monitoring #Data Pipeline #Leadership #Data Management #Data Engineering #Data Processing #Logging #DevOps #SQL (Structured Query Language) #Data Governance #GCP (Google Cloud Platform) #Version Control #BigQuery #ETL (Extract, Transform, Load) #Security #Dataflow #Automation #Apache Beam #Data Storage #SQL Queries #Scala #Data Modeling #Metadata #Cloud #Storage #Data Architecture
Role description
Role: GCP Data Engineer Architect or Lead
Location: Palm Beach, FL (Onsite)
Open for contract
Job Description:
We are looking for a highly experienced Lead Data Engineer / Architect with deep expertise in Google Cloud Platform (GCP) to lead the design, development, and optimization of large-scale, cloud-native data platforms. The role requires strong hands-on capabilities along with technical leadership, architectural decision-making, and mentoring responsibilities.
Key Responsibilities
· Lead end-to-end design and implementation of scalable data architectures on GCP.
· Architect, develop, and optimize data pipelines and data storage solutions using Dataflow, Cloud SQL, and BigQuery (an illustrative pipeline sketch follows this list).
· Own and optimize complex SQL queries and Python-based data processing frameworks replicating data from ERP systems.
· Establish best practices for data modeling, ETL/ELT frameworks, and pipeline orchestration.
· Drive DevOps and CI/CD strategies for data platforms using GCP-native tools.
· Ensure platform reliability, data quality, security, and cost optimization.
· Act as a technical advisor to stakeholders, translating business requirements into scalable data solutions.
· Mentor and guide junior and mid-level data engineers.
· Conduct design reviews, performance tuning, and production issue resolution.
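For context on the kind of pipeline work described above, the sketch below shows a minimal Apache Beam batch pipeline in Python that loads a hypothetical ERP order extract from Cloud Storage into BigQuery. The project ID, bucket, dataset, table, and field names are placeholders for illustration, not details from this posting.

```python
# Illustrative sketch only: project, bucket, dataset, and table names are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_order(line):
    """Parse one CSV line from a hypothetical ERP order extract."""
    order_id, customer_id, amount = line.split(",")
    return {
        "order_id": order_id,
        "customer_id": customer_id,
        "amount": float(amount),
    }


def run():
    options = PipelineOptions(
        runner="DataflowRunner",              # use "DirectRunner" for local testing
        project="my-gcp-project",             # hypothetical project ID
        region="us-east1",
        temp_location="gs://my-bucket/tmp",   # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadExtract" >> beam.io.ReadFromText(
                "gs://my-bucket/erp/orders.csv", skip_header_lines=1)
            | "ParseRows" >> beam.Map(parse_order)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:erp_replica.orders",
                schema="order_id:STRING,customer_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

A pipeline like this would typically be submitted to Dataflow with the DataflowRunner; a production version would also need robust parsing, schema management, and error handling rather than the simple CSV split shown here.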
Required Skills & Qualifications
· 10–15 years of overall experience in data engineering and analytics platforms.
· Strong hands-on expertise in SQL and Python.
· Extensive experience with Google Cloud Platform, including:
o BigQuery
o Cloud SQL
o Dataflow (Apache Beam)
· Proven experience in GCP DevOps, including CI/CD pipelines, monitoring, logging, and automation.
· Strong understanding of distributed systems, data warehousing, and large-scale data processing.
· Experience designing and operating high-volume, high-availability data platforms.
· Hands-on experience with version control systems (Git).
Good to Have
· Experience with real-time/streaming architecture replicating data from legacy ERP systems (see the streaming sketch after this list).
· Knowledge of data governance, metadata management, and security best practices.
· GCP certifications (Professional Data Engineer / Cloud Architect).
· Experience working in large enterprise environments (Supply Chain knowledge a plus).
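As a rough illustration of the streaming replication mentioned above, the sketch below reads change events from a hypothetical Pub/Sub topic and appends them to BigQuery with Apache Beam. The topic, project, table, and message fields are assumptions made for the example only.

```python
# Illustrative sketch only: topic, project, and table names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-gcp-project",             # hypothetical project ID
        region="us-east1",
        temp_location="gs://my-bucket/tmp",   # hypothetical staging bucket
        streaming=True,                       # unbounded pipeline
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Assumes JSON messages with order_id, op, and payload fields.
            | "ReadChangeEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/erp-change-events")
            | "DecodeJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:erp_replica.order_changes",
                schema="order_id:STRING,op:STRING,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```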
--
Warm Regards!
Dheeraj Patel | US IT Recruiter
LinkedIn: https://www.linkedin.com/in/dheeraj-patel-8b451825b/
Email: dheeraj.patel@quantumworldit.com
“Together we can build a Better Tomorrow”