

GCP-Certified Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP-Certified Data Engineer in New York, hybrid for 6 months, offering a day rate of $960 USD. Requires 5+ years in cloud data engineering, experience migrating from Snowflake to BigQuery, and strong skills in Python, SQL, and ETL/ELT processes.
Country: United States
Currency: $ USD
Day rate: 960
Date discovered: May 22, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: New York, NY
Skills detailed: #BigQuery #Data Quality #Batch #Python #SQL (Structured Query Language) #Cloud #GCP (Google Cloud Platform) #Data Engineering #Data Ingestion #Data Migration #Snowflake #ETL (Extract, Transform, Load) #Scala #Infrastructure as Code (IaC) #Storage #Clustering #Data Pipeline #Airflow #Dataflow #Monitoring #Observability #Apache Beam #Datasets #Migration #Apache Airflow #Terraform #AWS (Amazon Web Services)
Role description
Job Title: GCP-Certified Data Engineer
Location: New York
Job Type: Hybrid (3 days in office)
Experience Level: Senior (5+ years)
We are seeking a GCP-Certified Data Engineer with 5+ years of hands-on experience in cloud data engineering, ideally with direct experience migrating from Snowflake to BigQuery. This role is key to modernizing and scaling our data infrastructure, ensuring robust data ingestion, transformation, and performance optimization using Google Cloud's native tools.
Key Responsibilities:
Design, develop, and optimize scalable ETL/ELT pipelines using Apache Beam (Dataflow) and Pub/Sub
Orchestrate complex data workflows using Cloud Composer (Apache Airflow)
Lead or support large-scale data migrations from AWS/Snowflake to BigQuery, including schema mapping and performance tuning
Enhance BigQuery performance through strategic use of partitioning, clustering, and effective resource management
Implement rigorous data quality frameworks and validation checks, and ensure pipeline observability and monitoring
Partner with analytics, product, and business teams to understand data needs and deliver timely, reliable data solutions
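To illustrate the data quality and validation work described above, here is a minimal sketch of a row-level validation step such as might run before loading records into BigQuery. The field names and rules (`event_id`, `event_ts`, `user_id`) are hypothetical examples, not part of the role description.

```python
# Hypothetical required fields for an incoming event record.
REQUIRED_FIELDS = {"event_id", "event_ts", "user_id"}

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "event_ts" in row and not isinstance(row["event_ts"], (int, float)):
        errors.append("event_ts must be a numeric epoch timestamp")
    return errors

def partition_by_validity(rows):
    """Split a batch into (valid, rejected) so bad rows can be dead-lettered."""
    valid, rejected = [], []
    for row in rows:
        (rejected if validate_row(row) else valid).append(row)
    return valid, rejected
```

In a real pipeline this kind of check would typically live inside a Beam `DoFn` with a side output for rejected records, feeding rejection counts into monitoring.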
Required Skills and Experience:
GCP Certified (Professional Data Engineer preferred)
5+ years of experience in cloud data engineering, including real-time and batch processing
Strong proficiency in Python and SQL
Deep understanding of BigQuery, Dataflow, Pub/Sub, and Cloud Storage
Experience with Cloud Composer (Airflow) for orchestration
Prior experience with ETL/ELT migrations, particularly from Snowflake to GCP
Proven track record in performance optimization and managing large datasets (structured & semi-structured)
Familiarity with Terraform or Infrastructure as Code (IaC)
Experience with CI/CD for data pipelines
Knowledge of AWS services and multi-cloud data strategies
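The schema-mapping portion of a Snowflake-to-BigQuery migration can be sketched as a simple type translation. The table below is a simplified illustration only; a real migration would also handle numeric precision, parameterized types, and nested `VARIANT` structures.

```python
# Simplified, illustrative Snowflake -> BigQuery type map (not exhaustive).
SNOWFLAKE_TO_BIGQUERY = {
    "NUMBER": "NUMERIC",
    "FLOAT": "FLOAT64",
    "VARCHAR": "STRING",
    "BOOLEAN": "BOOL",
    "TIMESTAMP_NTZ": "DATETIME",
    "TIMESTAMP_TZ": "TIMESTAMP",
}

def map_column(name: str, snowflake_type: str) -> tuple[str, str]:
    """Translate one column definition; fall back to STRING for unknown types."""
    base = snowflake_type.split("(")[0].upper()  # strip e.g. VARCHAR(255) -> VARCHAR
    return name, SNOWFLAKE_TO_BIGQUERY.get(base, "STRING")
```

For example, `map_column("amount", "NUMBER(10,2)")` yields `("amount", "NUMERIC")`, after which performance tuning (partitioning on a timestamp column, clustering on frequent filter columns) happens on the BigQuery side.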