

GCP Data Engineer (contract)
Featured Role | Apply direct with Data Freelance Hub
This role is a GCP Data Engineer (contract) for up to "X months" with a pay rate of $49.83 - $77.86/hour. Key skills include SQL, Python, R, and GCP services. Experience in building automated ETL pipelines is required.
Country: United States
Currency: $ USD
Day rate: $616
Date discovered: August 19, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Sunnyvale, CA
Skills detailed: #Security #Data Pipeline #Programming #Data Science #Infrastructure as Code (IaC) #R #Tableau #Consulting #Google Data Studio #BigQuery #Data Engineering #React #BI (Business Intelligence) #Data Access #Dataflow #GCP (Google Cloud Platform) #Data Security #Looker #Storage #Terraform #Batch #Scala #Data Wrangling #Automation #ML (Machine Learning) #DevOps #Cloud #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Analysis
Role description
We are seeking a GCP Data Engineer to design, optimize, and deliver scalable data solutions that enable advanced analytics and data-driven decision-making. This role requires deep technical expertise in SQL, Python, R, and Google Cloud Platform (GCP) services. You will be responsible for building automated ETL pipelines, developing dashboards, and collaborating with stakeholders to translate business needs into actionable insights.
Key Responsibilities
• Perform advanced data analysis using SQL, Python, and R to identify trends, patterns, and insights.
• Partner with stakeholders to convert business requirements into analytical and technical solutions.
• Design, develop, and maintain ETL pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Composer (see the pipeline sketch after this list).
• Build and deploy real-time dashboards with Looker, Google Data Studio, or similar BI tools.
• Optimize and streamline workflows to improve data accessibility, scalability, and reliability.
• Drive automation and process improvements across data engineering operations.
• Deliver high-quality outputs under tight deadlines in a fast-paced, high-impact environment.
• Clearly communicate technical findings and recommendations to both technical and non-technical audiences.
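For candidates gauging the day-to-day work, the pipeline responsibility above typically looks something like the following Apache Beam job run on Dataflow: read events from Pub/Sub, transform them, and stream them into BigQuery. This is a minimal illustrative sketch, not code from the employer; the project, topic, table, and event field names are all hypothetical.

```python
# Minimal streaming ETL sketch: Pub/Sub -> parse -> BigQuery.
# All resource names (project, topic, table) and the event schema are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a row matching the target BigQuery schema."""
    event = json.loads(message.decode("utf-8"))
    return {"user_id": event["user_id"], "action": event["action"], "ts": event["ts"]}


def run() -> None:
    # On Dataflow you would also pass --runner=DataflowRunner, --project, --region, etc.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events"  # hypothetical topic
            )
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```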
Technical Profile
• Proficiency in SQL for advanced querying, data transformation, and performance tuning (see the query sketch after this list).
• Intermediate to advanced programming in Python and R for data wrangling and statistical analysis.
• Hands-on experience with GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Cloud Functions.
• Expertise in building automated ETL pipelines and handling both batch and streaming data.
• Experience with BI/dashboarding tools such as Looker, Tableau, or Google Data Studio.
• Familiarity with CI/CD pipelines and Infrastructure as Code (e.g., Terraform).
• Knowledge of data science or machine learning concepts is a plus.
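As a rough illustration of the SQL-plus-Python pairing the profile asks for, here is a small sketch that runs a parameterized query through the google-cloud-bigquery client. The project, dataset, table, and column names are made up for the example.

```python
# Minimal sketch of running a parameterized BigQuery query from Python.
# Dataset, table, and column names are hypothetical; auth uses Application
# Default Credentials (e.g. `gcloud auth application-default login`).
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT user_id, COUNT(*) AS sessions
    FROM `my-project.analytics.events`
    WHERE ts >= @since
    GROUP BY user_id
    ORDER BY sessions DESC
    LIMIT 10
"""
# Query parameters avoid string interpolation (and SQL injection) in the query text.
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "since", "TIMESTAMP", datetime(2025, 1, 1, tzinfo=timezone.utc)
        )
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.sessions)
```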
Functional Profile
• Strong problem-solving skills with the ability to design scalable and efficient solutions.
• Ability to manage multiple priorities in a high-demand environment.
• Experience working cross-functionally to align business requirements with data engineering solutions.
• Strong stakeholder management, collaboration, and communication skills.
• Ability to deliver clear, actionable recommendations to both business and technical teams.
Required Qualifications
• Proficiency in SQL for complex querying and transformation
• Intermediate experience with Python and R for data analysis
• Hands-on expertise with GCP services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Functions)
• Experience in building automated ETL pipelines and streaming data workflows (see the scheduling sketch after this list)
• Familiarity with dashboarding tools (Looker, Tableau, Google Data Studio)
• Strong problem-solving and independent working capability
• Excellent communication and stakeholder management skills
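Since Cloud Composer (managed Apache Airflow) appears in the responsibilities, "automated ETL pipelines" in practice usually means scheduled jobs along these lines. A minimal, hypothetical Airflow 2.x DAG sketch; the DAG id, schedule, and task body are placeholders.

```python
# Minimal Airflow 2.x DAG sketch for a scheduled ETL step, as it might run on
# Cloud Composer. The DAG id, schedule, and task body are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load(**context) -> None:
    # A real task would extract from a source system and load into BigQuery,
    # e.g. via the google-cloud-bigquery client or a transfer operator.
    print("running daily load for", context["ds"])


with DAG(
    dag_id="daily_events_etl",  # hypothetical
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```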
Preferred Qualifications
• GCP Professional Data Engineer certification
• Experience with CI/CD pipelines and Infrastructure as Code (Terraform)
• Background in data science or machine learning
Skills Summary
Core Expertise
Data engineering, ETL development, workflow optimization, real-time analytics, GCP ecosystem
Languages & Frameworks
SQL, Python, R
Reactive & Event-Driven Tools
Pub/Sub, Dataflow
Cloud & Containerization
Google Cloud Platform (BigQuery, Cloud Storage, Cloud Functions, Cloud Composer), Terraform (IaC)
Database & Messaging
BigQuery, SQL-based systems, streaming data pipelines
DevOps & CI/CD
CI/CD pipelines, Infrastructure as Code (Terraform)
Other Tools & Technologies
Looker, Tableau, Google Data Studio, dashboard development, workflow automation
Soft Skills
Problem-solving, communication, stakeholder management, adaptability, time management, collaboration
The pay range that the employer in good faith reasonably expects to pay for this position is $49.83/hour - $77.86/hour. Our benefits include medical, dental, vision and retirement benefits. Applications will be accepted on an ongoing basis.
Tundra Technical Solutions is among North America's leading providers of Staffing and Consulting Services. Our success and our clients' success are built on a foundation of service excellence. We are an equal opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, sex, sexual orientation, age, veteran status, disability, genetic information, or other applicable legally protected characteristic. Qualified applicants with arrest or conviction records will be considered for employment in accordance with applicable law, including the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act.
Unincorporated LA County workers: we reasonably believe that criminal history may have a direct, adverse and negative relationship with the following job duties, potentially resulting in the withdrawal of a conditional offer of employment: protect client-provided property, including hardware (both of which may include data), entrusted to you from theft, loss or damage; return all portable client computer hardware in your possession (including the data contained therein) upon completion of the assignment; and maintain the confidentiality of client proprietary, confidential, or non-public information. In addition, job duties require access to secure and protected client information technology systems and related data security obligations.