

CDI Solutions
GCP Data Engineer Only W2
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with 10+ years of experience, focusing on Python, Django, and Apache Spark. It offers a long-term hybrid contract in Charlotte, NC; the pay rate is not disclosed. GCP certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 10, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#SQL (Structured Query Language) #Data Pipeline #Python #REST (Representational State Transfer) #Data Processing #Data Engineering #Version Control #REST API #Dataflow #GitHub #Monitoring #Django #Security #Logging #Microservices #Batch #GitLab #Docker #Kubernetes #Apache Spark #Cloud #GCP (Google Cloud Platform) #Data Modeling #Automated Testing #Terraform #Infrastructure as Code (IaC) #Spark (Apache Spark) #Storage #BigQuery #Linux #Scala #PySpark
Role description
Hello
One of my clients is hiring a GCP Data Engineer in Charlotte, NC (hybrid) on a long-term contract.
Job: GCP Data Engineer
Location: Charlotte, NC (3 days hybrid)
Type: Long-Term Contract
We are seeking a highly experienced Google Cloud Engineer (10+ years overall) with deep expertise in Python, Django, Spark, and Google Cloud Platform (GCP) to design, build, and optimize scalable, cloud-native data and analytics solutions.
Key Responsibilities
Design, develop, and maintain cloud-native backend applications using Python and Django on GCP
Build and optimize Apache Spark / PySpark pipelines for large-scale batch and streaming data processing (a brief PySpark sketch follows this list)
Architect end-to-end solutions leveraging GCP services such as Compute Engine, GKE, Cloud Run, BigQuery, Dataproc, Dataflow, Cloud Storage, and Pub/Sub
Develop secure, scalable RESTful APIs and microservices using Django and Django REST Framework
Ensure high availability, performance, and scalability of applications and data pipelines
Implement CI/CD pipelines and Infrastructure as Code (Terraform, Cloud Build, GitHub/GitLab)
Apply best practices for security, authentication, authorization, and data protection
Monitor, troubleshoot, and optimize production workloads using Cloud Monitoring and Logging
Collaborate with product managers, data engineers, and platform teams to deliver business-critical solutions
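As a rough illustration of the PySpark pipeline work referenced above, the sketch below reads raw JSON events from Cloud Storage, aggregates them, and writes the result to BigQuery. It is a minimal sketch, not this client's actual codebase: it assumes a Dataproc cluster with the spark-bigquery connector available, and every bucket, dataset, table, and column name is a hypothetical placeholder.

```python
# Minimal sketch of a PySpark batch pipeline on GCP, assuming a Dataproc
# cluster with the spark-bigquery connector on the classpath. All bucket,
# dataset, table, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-batch").getOrCreate()

# Read raw JSON events previously landed in Cloud Storage (hypothetical path).
events = spark.read.json("gs://example-raw-bucket/events/2026-02-10/")

# Basic cleanup and aggregation: one row per (user, event type).
daily_counts = (
    events
    .filter(F.col("event_type").isNotNull())
    .groupBy("user_id", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write results to BigQuery via the spark-bigquery connector; indirect
# writes stage data through a temporary GCS bucket.
(
    daily_counts.write
    .format("bigquery")
    .option("table", "example_dataset.daily_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)

spark.stop()
```

On Dataproc, a job like this would typically be submitted with `gcloud dataproc jobs submit pyspark`; the same DataFrame logic carries over to the streaming case with `spark.readStream` against a Pub/Sub or file source.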
Required Skills & Qualifications
10+ years of professional experience in software or cloud engineering
Strong hands-on experience with Google Cloud Platform (GCP)
Extensive experience with Django and Django REST Framework (a minimal DRF sketch follows this list)
Strong hands-on experience with Apache Spark / PySpark
Solid understanding of distributed systems, microservices architecture, and REST APIs
Experience with BigQuery, SQL optimization, and data modeling
Experience with Dataproc, Dataflow, or managed Spark environments
Proficiency with Docker, Kubernetes (GKE), and Linux
Experience with CI/CD, version control, and automated testing
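As a rough illustration of the Django REST Framework experience called out above, a minimal read-only API endpoint might look like the following. This is a sketch under assumptions: the `pipelines` app, the `Pipeline` model, and its fields are hypothetical placeholders, not anything from this posting.

```python
# Minimal sketch of a read-only Django REST Framework endpoint. The
# `pipelines` app and `Pipeline` model are hypothetical placeholders.
from rest_framework import routers, serializers, viewsets

from pipelines.models import Pipeline  # hypothetical model


class PipelineSerializer(serializers.ModelSerializer):
    class Meta:
        model = Pipeline
        fields = ["id", "name", "status", "updated_at"]  # hypothetical fields


class PipelineViewSet(viewsets.ReadOnlyModelViewSet):
    """Exposes GET /pipelines/ and GET /pipelines/<pk>/."""
    queryset = Pipeline.objects.order_by("-updated_at")
    serializer_class = PipelineSerializer


# In urls.py, wire the viewset through a DRF router:
router = routers.DefaultRouter()
router.register(r"pipelines", PipelineViewSet)
# urlpatterns = [path("api/", include(router.urls))]
```

Registered this way, the endpoint inherits pagination, authentication, and permission behavior from the project's DRF settings, which is where the security and authorization practices listed above would be enforced.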
Preferred Qualifications
GCP certifications





