

Kanak Elite Services
W2 Remote Role :: GCP Cloud Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a remote W2 contract role for a GCP Cloud Data Engineer requiring 7+ years of experience, offering competitive pay. Key skills include GCP data components, Python, SQL, Git, CI/CD, and Terraform. Requires strong data pipeline development and collaboration abilities.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#Terraform #Cloud #GIT #Data Architecture #Databases #BigQuery #GCP (Google Cloud Platform) #Airflow #Data Pipeline #Dataflow #Python #Data Engineering #SQL (Structured Query Language) #Infrastructure as Code (IaC) #API (Application Programming Interface)
Role description
Role :: Google Cloud Data Engineer
Location :: REMOTE (USA)
Duration :: Contract (W2 Only)
Experience requirements (years):
• GCP Cloud Data Engineer: 7+
• GCP data components (Airflow/Composer, Dataflow, Dataproc, BigQuery, Cloud Functions, Cloud Databases, API Gateway / Apigee): 3+
• Python (Airflow operators and DAGs): 3+
• SQL: 7+
• Git: 3+
• CI/CD pipelines: 5+
• Terraform for infrastructure as code: 3+
Required
• Strong familiarity with Google Cloud Platform (GCP) data components:
• Airflow/Composer
• Dataflow
• Dataproc
• BigQuery
• Cloud Functions
• Cloud Databases
• API Gateway / Apigee
• Advanced proficiency in:
• Python (Airflow operators and DAGs)
• SQL
• Git
• CI/CD pipelines
• Working knowledge of Terraform for infrastructure as code.
• Skills:
• Able to work independently while aligning with team standards and architecture.
• Quick learner with strong coding fundamentals.
• Comfortable in a fast-paced, delivery-focused environment.
Responsibilities
• Collaborate semi-independently within an established technical vision to build and optimize data pipelines.
• Rapidly contribute to development efforts with minimal ramp-up time.
• Work closely with internal teams to ensure alignment with data architecture standards and delivery timelines.