TestingXperts

Full Time || Data Engineering Principal Architect with Strong GCP Expertise || Remote with Travel (Independent Visa Holders Only)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Full-Time Data Engineering Principal Architect with strong GCP expertise, requiring 13-18 years of experience in data engineering, particularly in the retail domain. Remote work with travel; pay rate unspecified. Bachelor's degree required; certifications in GCP preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 27, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
California, United States
-
🧠 - Skills detailed
#DevOps #Data Science #Python #ETL (Extract, Transform, Load) #Data Architecture #Monitoring #AI (Artificial Intelligence) #Computer Science #BigQuery #Dataflow #Deployment #Scala #Kubernetes #ML (Machine Learning) #Agile #Data Engineering #Programming #SQL (Structured Query Language) #Docker #GCP (Google Cloud Platform) #Cloud #Data Pipeline
Role description
Data Engineering Principal Architect with Strong GCP Expertise
Location: Remote with some travel (San Jose, CA)
Full Time. Green Card holders and US Citizens only.
As a DE Principal Architect, you will lead the design and implementation of advanced data engineering solutions leveraging Google Cloud Platform technologies within the retail domain. Success in this role means you will architect scalable, robust data pipelines and infrastructure that empower analytics and AI adoption across client organizations. You will collaborate closely with business stakeholders and technical teams to translate complex requirements into effective technical architectures.
Responsibilities:
- Design and architect end-to-end scalable data engineering solutions using Google Cloud Platform services such as BigQuery, Bigtable, Dataflow, Dataproc, Datastore, and Composer.
- Lead development and deployment of data pipelines using Python and SQL to ingest, process, and transform retail data at scale.
- Drive cloud DevOps practices to ensure continuous integration, delivery, and operational excellence of data platforms.
- Collaborate with data scientists, analysts, and software engineers to enable last-mile adoption of insights through robust data infrastructure.
- Provide technical guidance and mentorship to engineering teams, fostering best practices in cloud data architecture and engineering.
- Engage with clients and stakeholders to understand business goals and translate them into scalable technical solutions aligned with industry standards.
You will need:
- 13-18 years of experience in data engineering, with strong expertise in Google Cloud Platform technologies including Composer, BigQuery, Bigtable, Dataflow, Dataproc, and Datastore.
- Advanced skills in Python programming and SQL query development to build and optimize data workflows.
- Command of DevOps principles and tools to implement automated cloud-based deployment and monitoring of data solutions.
- A Bachelor's degree in Computer Science, Engineering, or a related technical discipline.
- Strong architectural design capabilities and experience delivering large-scale data engineering projects in the retail sector or similar industries.
- Effective communication with cross-functional teams and stakeholders to ensure alignment and successful project outcomes.
Good to Have Skills:
- Certifications related to Google Cloud Platform, such as Google Professional Data Engineer or Google Cloud Architect.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) to support cloud-native data engineering operations.
- Experience working in Agile development environments to drive iterative delivery and continuous improvement.
- Understanding of machine learning workflows and how to architect data platforms that support AI/ML initiatives.
- Willingness to develop expertise in additional cloud platforms or data engineering technologies to enhance cross-platform solution design.