GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with a contract length of "X months" and a pay rate of "$Y per hour." Key skills include ETL/DWH experience, SQL proficiency, and familiarity with Snowflake or BigQuery. Azure experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 18, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Seattle, WA
🧠 - Skills detailed
#Batch #Automation #Data Quality #SQL (Structured Query Language) #Data Management #BigQuery #Azure #Data Engineering #Data Orchestration #Azure cloud #Snowflake #BI (Business Intelligence) #SQL Queries #Collibra #GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #Cloud
Role description
- Experience in ETL/DWH and BI report testing.
- Proficient in SQL.
- Ability to review Snowflake stored procedures and data orchestration.
- Experience building test automation frameworks and creating automation scripts.
- Excellent problem-solving skills, with the ability to present findings and insights clearly and compellingly, both verbally and in writing.
- Knowledge of data quality essentials, data management fundamentals, and ETL concepts.
- Experience with cloud systems, specifically GCP.
- Experience with the Snowflake or BigQuery platform.
- Experience with data quality tools such as Collibra Data Quality.
- Experience with the Azure cloud.
- Expertise in testing both batch and real-time data using SQL queries.
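
For illustration, here is a minimal sketch of the kind of SQL-based batch data-quality check this role describes: comparing row counts between a staging table and its target in BigQuery. The project, dataset, and table names are hypothetical placeholders, and the google-cloud-bigquery client with application-default credentials is assumed.

```python
# Minimal sketch of a batch data-quality check: verify that a target table
# received the same number of rows as its staging source in BigQuery.
# Project, dataset, and table names below are hypothetical placeholders.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # assumes application-default credentials

SQL = """
SELECT
  (SELECT COUNT(*) FROM `my-project.staging.orders`)   AS source_rows,
  (SELECT COUNT(*) FROM `my-project.analytics.orders`) AS target_rows
"""

# Run the query and fetch the single result row.
row = next(iter(client.query(SQL).result()))

# Fail loudly on a mismatch, as a test-automation script would.
assert row.source_rows == row.target_rows, (
    f"Row-count mismatch: staging={row.source_rows}, target={row.target_rows}"
)
print(f"Row counts match: {row.source_rows}")
```

In practice a test automation framework would parameterize and schedule checks like this; the same pattern extends to null-rate, duplicate, and freshness checks across batch and real-time pipelines.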