

Sapphire Software Solutions Inc
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a 1-year contract, 100% remote, requiring 4-6 years of experience. Key skills include SQL, ETL/BI, GCP, BigQuery, Python, and strong communication. Experience in high-velocity tech environments is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#dbt (data build tool) #GCP (Google Cloud Platform) #BigQuery #Computer Science #Jenkins #Project Management #SQL (Structured Query Language) #BI (Business Intelligence) #Data Engineering #Python #EDW (Enterprise Data Warehouse) #Data Warehouse #ETL (Extract, Transform, Load)
Role description
+1 571-556-1002 |naresh@sapphiresoftwaresolutions.com
Hiring GCP Data Engineer -Only W2 || 100% Remote
Hi Folks, Please check the JD and share your updated resume
GCP Data Engineer
1 year contract
100% Remote
We need profiles with 4-6 years of experience.
We need strong communicators; candidates will have to pass an assessment given over Teams.
Qualifications:
• Strong written + verbal communication (these resources will work directly with project management, product management, and stakeholders to gather requirements, work through testing/validations, etc. They need to be able to drive the project forward, work independently, and be comfortable interacting with end-users and key stakeholders)
• Ability to speak to the outcomes they have driven within their domains through their queries
• SQL expert
• ETL/BI development experience
• Jenkins or other orchestration tool
• GCP
• BigQuery
• Python
• Dataform and/or DBT
• Must be a systemic thinker
Nice to Have Skills & Experience
• Candidates coming from a newer company in the online space such as Chewy, Amazon, Wayfair, Netflix (doesn't have to be a Fortune 100; a legit startup within this space is fine too)
• T.A. (Teaching Assistant) experience in grad school
• Experience working with high-throughput streaming / high-velocity technologies (seeing these in action at a decent-sized company is great background experience)
Job Description
We are looking to hire elite Data Engineers to join our client's Enterprise Data Warehouse Team. This team is responsible for supporting all 12 of the company's business units and providing them with the data they need to optimize their operations and increase efficiency/profitability. The company has a centralized data organization and is Google's largest customer. Over 200 people report to the director of this organization, and his teams are responsible for working with over 170 petabytes of data within BigQuery. This Data Engineer will be responsible for legacy redesigns and net-new projects across all 12 business units, using cutting-edge technology throughout. The ideal candidate can vary in years of experience but must be able to think systemically about computer science rather than just being able to code. For example, there are a hundred ways to write a piece of code, but candidates must understand the trade-offs of each approach and be able to explain why they chose the one they did. We are looking for someone with a growth mindset who is able to reflect on their prior work and think about what they would do differently next time to improve it. The overarching goal of this team is not just to build a data set, but to build one that is robust, using processes and techniques that will allow it to work just as well on day 100 as it did on day 1.






