Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
πŸ—“οΈ - Date discovered
September 9, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Scala #Data Architecture #Computer Science #Data Modeling #Hadoop #Spark (Apache Spark) #ETL (Extract, Transform, Load) #Agile #GCP (Google Cloud Platform) #Looker #Cloud #BigQuery #Data Engineering #Data Extraction #SQL Queries #Data Quality
Role description
📌 Job Description: Data Engineer (GCP & BigQuery)

We are seeking a highly skilled Data Engineer with strong expertise in Google Cloud Platform (GCP), particularly BigQuery. This role is ideal for someone who thrives in a fast-paced environment, enjoys solving complex data challenges, and can translate business needs into scalable technical solutions.

Key Responsibilities
• Design, develop, test, and maintain ETL pipelines and data solutions within GCP, with a focus on BigQuery.
• Write and optimize complex SQL queries for data extraction, transformation, and analysis.
• Discover and integrate raw data from diverse sources into consistent, scalable architectures.
• Manage cloud data ingress and egress using tools such as the gcloud, gsutil, and bq command-line utilities.
• Collaborate with business stakeholders to understand analytical needs and translate them into technical requirements.
• Ensure data quality, integrity, and performance across all data processes.
• Work closely with cross-functional teams to support data-driven decision-making.
• Use LookML to build and maintain views, explores, dashboards, and persistent derived tables.

✅ Required Skills & Experience
• Proven experience with Google Cloud Platform, especially BigQuery.
• Strong proficiency in SQL and data modeling.
• Hands-on experience with ETL development and cloud data architecture.
• Familiarity with GCP command-line tools (gcloud, gsutil, bq).
• Excellent analytical and problem-solving skills.
• Strong communication and collaboration abilities.
• Experience with Looker and LookML development.

✨ Nice to Have
• Experience with Hadoop, Spark/Hive, and UC4 scheduling tools.
• Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
• Experience working in agile environments and with cross-functional teams.

Compensation: $55-$60/hour
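To give a flavor of the ETL and data-quality work this role describes, here is a minimal sketch using the `google-cloud-bigquery` client library. All dataset, table, and field names are hypothetical (not from this posting), and the actual load call requires GCP credentials; the validation helper runs standalone.

```python
"""Sketch of a BigQuery ETL step: validate rows, then stream them in.

Hypothetical example; table/field names are illustrative only.
"""


def validate_rows(rows):
    """Data-quality gate: drop rows missing required keys or with
    non-positive amounts, returning only clean rows."""
    required = {"order_id", "amount"}
    clean = []
    for row in rows:
        if not required.issubset(row):      # missing a required field
            continue
        if row["amount"] is None or row["amount"] <= 0:
            continue                         # non-positive amount
        clean.append(row)
    return clean


def load_to_bigquery(rows, table_id):
    """Stream validated rows into BigQuery (needs GCP credentials)."""
    from google.cloud import bigquery  # deferred: optional dependency

    client = bigquery.Client()
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")


if __name__ == "__main__":
    raw = [
        {"order_id": 1, "amount": 19.99},
        {"order_id": 2, "amount": -5.00},  # rejected: non-positive
        {"amount": 3.50},                  # rejected: missing order_id
    ]
    rows = validate_rows(raw)
    # load_to_bigquery(rows, "my-project.sales.orders")  # hypothetical table
    print(len(rows))  # → 1
```

In a production pipeline the same ingress/egress could instead be handled with the `bq` and `gsutil` command-line tools the posting mentions; the validation step stays the same either way.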