

Lorven Technologies Inc.
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with a contract length of "unknown" at a pay rate of "unknown," located in Denver, CO (Remote). Requires 6+ years in Data Engineering, expertise in GCP services, and strong programming skills in Python/Java.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 24, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Denver, CO
-
🧠 - Skills detailed
#Apache Beam #Dataflow #Data Science #Automation #Storage #Data Ingestion #BigQuery #Data Modeling #Java #Programming #ML (Machine Learning) #Python #Databases #Scala #Cloud #Agile #Data Pipeline #Batch #ETL (Extract, Transform, Load) #Scrum #Apache Spark #BI (Business Intelligence) #Security #Compliance #Data Processing #Deployment #Data Governance #Data Quality #GCP (Google Cloud Platform) #Data Engineering #SQL (Structured Query Language) #Infrastructure as Code (IaC) #Terraform #Airflow #Looker #Spark (Apache Spark)
Role description
Hi,
Our client is looking for a GCP Data Engineer in Denver, CO (Remote); the detailed requirements are below. Please share your updated resume if you are interested.
Job Title: GCP Data Engineer
Location: Denver, CO (Remote)
Employment Type: Contract - W2
Job Description
We are seeking an experienced GCP Data Engineer to join our Denver, CO-based data engineering team (working remotely). The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions on Google Cloud Platform (GCP). This role requires strong hands-on experience with GCP native services, data modeling, and large-scale data processing.
Key Responsibilities
• Design, develop, and maintain ETL/ELT data pipelines on GCP
• Build and optimize data solutions using BigQuery, Cloud Dataflow, Dataproc, and Cloud Composer
• Develop batch and streaming data pipelines using Apache Beam, Spark, and Pub/Sub
• Implement data ingestion from multiple sources (APIs, databases, flat files, streaming systems)
• Ensure data quality, reliability, performance, and security best practices
• Collaborate with data scientists, analysts, and application teams to support analytics and reporting needs
• Optimize query performance and cost management in BigQuery
• Implement CI/CD pipelines and infrastructure automation using Terraform or Deployment Manager
• Monitor and troubleshoot data pipelines in production environments
• Follow data governance, compliance, and security standards
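To give candidates a feel for the batch and streaming pipeline work listed above, here is a minimal sketch in plain Python of tumbling-window aggregation, the same shape of computation an Apache Beam/Dataflow streaming job would express with fixed windows. The event data and window size are illustrative assumptions, not part of the role description.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed (tumbling) windows and
    count occurrences per key per window -- the aggregation shape a
    Beam/Dataflow streaming pipeline would express with FixedWindows."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Simulated click events: (epoch seconds, action) -- hypothetical sample data
events = [(0, "view"), (12, "click"), (65, "view"), (70, "view"), (130, "click")]
print(tumbling_window_counts(events, 60))
# {0: {'view': 1, 'click': 1}, 60: {'view': 2}, 120: {'click': 1}}
```

A production Beam pipeline adds watermarks, triggers, and late-data handling on top of this basic windowing idea.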
Required Skills & Qualifications
• 6+ years of experience in Data Engineering
• Strong hands-on experience with Google Cloud Platform (GCP)
• Expertise in BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc
• Strong programming skills in Python and/or Java
• Experience with Apache Spark, Apache Beam
• Solid understanding of data warehousing, data modeling, and SQL
• Experience with Airflow / Cloud Composer
• Familiarity with CI/CD pipelines and Infrastructure as Code (Terraform)
• Experience working in Agile/Scrum environments
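One concrete example of the BigQuery cost-management skill called out above is partition pruning: queries that filter on a table's partition column scan far less data. The sketch below is a hypothetical pre-submission guardrail in plain Python, mirroring the effect of BigQuery's `require_partition_filter` table option; the column name and queries are assumptions for illustration.

```python
import re

PARTITION_COLUMN = "event_date"  # hypothetical partition column

def has_partition_filter(sql: str, column: str = PARTITION_COLUMN) -> bool:
    """Crude check that a query's WHERE clause mentions the partition
    column, so full-table scans (and their cost) are caught before
    the query is submitted to BigQuery."""
    where = re.search(r"\bWHERE\b(.*)", sql, re.IGNORECASE | re.DOTALL)
    return bool(where and re.search(rf"\b{column}\b", where.group(1)))

good = "SELECT user_id FROM clicks WHERE event_date = '2026-03-24'"
bad = "SELECT user_id FROM clicks"
print(has_partition_filter(good), has_partition_filter(bad))  # True False
```

In practice a dry-run against the BigQuery API gives the authoritative bytes-scanned estimate; this string check is only a cheap first gate.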
Nice to Have
• GCP Professional Data Engineer Certification
• Experience with real-time/streaming data pipelines
• Knowledge of machine learning data pipelines on GCP
• Experience with Looker or other BI tools
• Healthcare, Finance, or Retail domain experience
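The Airflow / Cloud Composer experience required above boils down to expressing pipelines as dependency graphs and letting a scheduler order the tasks. A minimal sketch using Python's standard-library `graphlib`; the task names are hypothetical, not from this posting.

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: two extracts feed a transform, which feeds a load --
# the dependency-resolution problem Airflow/Composer solves at scale.
deps = {
    "load_to_bigquery": {"transform"},
    "transform": {"extract_api", "extract_db"},
    "extract_api": set(),
    "extract_db": set(),
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # extracts first, then transform, then load
```

An Airflow DAG adds scheduling intervals, retries, and operators per task, but the underlying ordering logic is exactly this topological sort.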
