

Techvy Corp
GCP Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer with a 6+ month W2 contract in New York, NY (hybrid, 3 days onsite). Requires 6+ years of Data Engineering experience, 2+ years in GCP, expertise in NoSQL, SQL, Python, and data orchestration tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 15, 2026
-
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York City Metropolitan Area
-
🧠 - Skills detailed
#Airflow #Data Integration #Storage #Terraform #Cloud #Security #Monitoring #Apache Beam #BI (Business Intelligence) #Automation #Data Pipeline #Infrastructure as Code (IaC) #dbt (data build tool) #ETL (Extract, Transform, Load) #Data Modeling #Dataflow #Python #SQL (Structured Query Language) #Looker #Data Analysis #Scala #Data Orchestration #BigQuery #Data Engineering #MongoDB #NoSQL #GCP (Google Cloud Platform) #Data Security
Role description
Hiring Now: GCP Data Engineer | Hybrid – New York, NY
Location: New York (Hybrid – 3 days onsite per week)
Job Type: W2 Contract
Start Date: ASAP
Duration: 6+ Months (with possible extension)
About the Role:
We are looking for a highly skilled GCP Data Engineer to join our team in New York, NY. This hybrid position requires 3 days onsite per week and involves building scalable, secure, and high-performance data solutions using Google Cloud Platform (GCP) technologies.
Key Responsibilities:
• Design and implement large-scale data pipelines using GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions
• Develop ETL/ELT workflows and automate data integration processes
• Collaborate with data analysts, scientists, and product teams to deliver clean, reliable data
• Optimize performance and cost of data infrastructure
• Ensure data security, monitoring, and governance
• Use tools like Apache Beam, Airflow (Cloud Composer), and Terraform for orchestration and automation (see the pipeline sketch after this list)
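To give a flavor of the stack named above, here is a minimal Apache Beam sketch in Python that reads JSON events from Pub/Sub and streams them into BigQuery. The project, topic, table, and schema names are placeholders for illustration, not details from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True because Pub/Sub is an unbounded source
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical topic; substitute your project's Pub/Sub topic
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
            # Pub/Sub delivers bytes; decode and parse each message as JSON
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Hypothetical table and schema; WRITE_APPEND preserves existing rows
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()

A pipeline like this runs locally on the DirectRunner for testing, or on Dataflow by passing --runner=DataflowRunner along with project, region, and staging-location flags.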
Required Qualifications:
• 6+ years of experience in Data Engineering, including 2+ years with Google Cloud Platform
• Strong expertise in NoSQL databases such as MongoDB
• Proficiency in SQL, Python, and Apache Beam
• Hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage
• Familiarity with data orchestration tools (e.g., Cloud Composer / Airflow); see the DAG sketch after this list
• Experience with CI/CD and Infrastructure as Code (e.g., Terraform)
• Strong understanding of data warehousing, data modeling, and performance tuning
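On the orchestration side, a minimal Cloud Composer / Airflow DAG might look like the sketch below. The DAG id, dataset, and query are hypothetical placeholders; BigQueryInsertJobOperator ships in the apache-airflow-providers-google package.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_transform",  # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",                # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    # Hypothetical ELT step: materialize a cleaned table inside BigQuery
    transform_events = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `my-project.analytics.daily_events` AS "
                    "SELECT event_id, user_id, event_ts "
                    "FROM `my-project.staging.raw_events`"
                ),
                "useLegacySql": False,
            }
        },
    )

A single task is enough to show the shape; production DAGs chain tasks with the >> operator and add sensors and alerting around them.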
Preferred Qualifications:
• GCP Professional Data Engineer Certification
• Experience with real-time streaming data solutions
• Exposure to Looker, dbt, or other analytics/BI tools
• Background in healthcare, finance, or regulated environments is a plus