

Lorven Technologies Inc.
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer, a full-time contract position lasting more than 6 months, with an unspecified pay rate. It requires strong GCP expertise, 6+ years of data engineering experience, and skills in Python/Java, BigQuery, and Apache Spark. The work is remote, with onsite presence twice a month in Denver, CO.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 3, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Denver, CO
-
🧠 - Skills detailed
#Scrum #Security #Storage #Java #Data Ingestion #Deployment #Apache Spark #Agile #ETL (Extract, Transform, Load) #Spark (Apache Spark) #BI (Business Intelligence) #BigQuery #GCP (Google Cloud Platform) #Compliance #Infrastructure as Code (IaC) #Data Pipeline #Data Processing #Scala #Data Science #Dataflow #Data Engineering #Batch #Data Modeling #Data Governance #SQL (Structured Query Language) #Automation #Data Quality #Terraform #Python #Airflow #Looker #ML (Machine Learning) #Apache Beam #Databases #Programming #Cloud
Role description
Hi,
Our client is looking for a GCP Data Engineer. The role is remote, with onsite presence twice a month in Denver, CO. The detailed requirements are below; please share your updated resume if you are interested.
Job Title: GCP Data Engineer
Location: Denver, CO (Remote, with onsite presence twice a month)
Employment Type: Full-Time / Contract
Job Description
• We are seeking an experienced GCP Data Engineer to join our data engineering team based in Denver, CO.
• The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions on Google Cloud Platform (GCP).
• This role requires strong hands-on experience with GCP native services, data modeling, and large-scale data processing.
Key Responsibilities
• Design, develop, and maintain ETL/ELT data pipelines on GCP
• Build and optimize data solutions using BigQuery, Cloud Dataflow, Dataproc, and Cloud Composer
• Develop batch and streaming data pipelines using Apache Beam, Spark, and Pub/Sub (an illustrative sketch follows this list)
• Implement data ingestion from multiple sources (APIs, databases, flat files, streaming systems)
• Ensure data quality, reliability, performance, and security best practices
• Collaborate with data scientists, analysts, and application teams to support analytics and reporting needs
• Optimize query performance and cost management in BigQuery
• Implement CI/CD pipelines and infrastructure automation using Terraform or Deployment Manager
• Monitor and troubleshoot data pipelines in production environments
• Follow data governance, compliance, and security standards
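As an illustration of the streaming work described above (not part of the client's requirements), the following is a minimal Apache Beam sketch of the Pub/Sub-to-BigQuery pattern; the project, topic, and table names are hypothetical placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Hypothetical resource names -- replace with real project, topic, and table.
    project = "my-gcp-project"
    topic = f"projects/{project}/topics/events"
    table = f"{project}:analytics.events"

    # streaming=True keeps the pipeline running and consuming from Pub/Sub.
    options = PipelineOptions(streaming=True, project=project)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=topic)
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()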
Required Skills & Qualifications
• 6+ years of experience in Data Engineering
• Strong hands-on experience with Google Cloud Platform (GCP)
• Expertise in BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc
• Strong programming skills in Python and/or Java
• Experience with Apache Spark, Apache Beam
• Solid understanding of data warehousing, data modeling, and SQL
• Experience with Airflow / Cloud Composer (see the sketch after this list)
• Familiarity with CI/CD pipelines and Infrastructure as Code (Terraform)
• Experience working in Agile/Scrum environments
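As a sketch of the Airflow / Cloud Composer orchestration named above (again, not part of the client's requirements), here is a minimal DAG that schedules a BigQuery aggregation job via the Google provider's BigQueryInsertJobOperator; the DAG id, project, dataset, and table names are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical query and destination -- for illustration only.
DAILY_AGGREGATE_SQL = """
SELECT event_date, COUNT(*) AS event_count
FROM `my-gcp-project.analytics.events`
GROUP BY event_date
"""

with DAG(
    dag_id="daily_event_aggregate",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submits the aggregation as a BigQuery query job and overwrites the destination table.
    aggregate_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": DAILY_AGGREGATE_SQL,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-gcp-project",
                    "datasetId": "analytics",
                    "tableId": "daily_event_counts",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )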
Nice to Have
• GCP Professional Data Engineer Certification
• Experience with real-time/streaming data pipelines
• Knowledge of machine learning data pipelines on GCP
• Experience with Looker or other BI tools
• Healthcare, Finance, or Retail domain experience