Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a 4-month Data Engineer contract (with possible extensions) in San Jose, CA, paying $54-64/hr. Key skills include LookML, Google Cloud Platform (GCP), SQL, and ETL development; experience with BigQuery and the Looker API/SDK is essential.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
512
-
πŸ—“οΈ - Date discovered
September 3, 2025
πŸ•’ - Project duration
3 to 6 months
-
🏝️ - Location type
On-site
-
πŸ“„ - Contract type
W2 Contractor
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Data Architecture #GCP (Google Cloud Platform) #Automation #Looker #API (Application Programming Interface) #Data Modeling #Spark (Apache Spark) #Visualization #SQL (Structured Query Language) #Data Extraction #BigQuery #Cloud #Scala #Data Engineering #Hadoop #ETL (Extract, Transform, Load) #SQL Queries
Role description
Title: Data Engineer
Duration: 4-month contract with possible extensions
Pay Range: $54-64/hr
Location: San Jose, CA

JOB DESCRIPTION
We are seeking a highly skilled Data Engineer with deep expertise in LookML and Google Cloud Platform (GCP) to join our data team. This role is ideal for someone who can design scalable data architectures, build intuitive dashboards, and collaborate cross-functionally to deliver impactful analytics solutions.

Responsibilities:
- Design, develop, test, and maintain ETL pipelines and data solutions within GCP, with a focus on BigQuery (a minimal pipeline sketch follows this section).
- Write and optimize complex SQL queries for data extraction, transformation, and analysis.
- Discover and integrate raw data from diverse sources into consistent, scalable architectures.
- Manage cloud data ingress and egress using tools such as the gcloud, gsutil, and bq command-line utilities (see the export sketch below).
- Translate business requirements into technical specifications and data models.
- Build and maintain LookML views, explores, dashboards, and persistent derived tables.
- Design clear, intuitive, and impactful dashboards following visualization best practices.
- Utilize Looker's API and SDK for automation, embedding, and advanced integrations (see the Looker SDK sketch below).
- Collaborate with cross-functional teams to support data-driven decision-making.

REQUIRED SKILLS AND EXPERIENCE
- Proven experience with Google Cloud Platform, especially BigQuery.
- Strong proficiency in SQL and data modeling.
- Hands-on experience with ETL development and cloud data architecture.
- Advanced skills in LookML and Looker dashboard development.
- Familiarity with GCP command-line tools (gcloud, gsutil, bq).
- Experience with the Looker API/SDK for automation and embedding.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work effectively in a fast-paced, dynamic environment.

NICE TO HAVE SKILLS AND EXPERIENCE
- Experience with Hadoop, Spark/Hive, and UC4 scheduling tools.
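
To make the BigQuery ETL responsibilities concrete, here is a minimal sketch in Python using the google-cloud-bigquery client. The posting does not prescribe a specific tool or schema, so the project, dataset, table, and bucket names below are hypothetical placeholders; treat this as one illustrative approach rather than the team's actual pipeline.

```python
# Minimal ETL sketch with the google-cloud-bigquery client.
# All project/dataset/table/bucket names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Ingest: load raw CSV files from Cloud Storage into a staging table.
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/orders_*.csv",
    "example-project.staging.orders_raw",
    job_config=load_config,
)
load_job.result()  # wait for the load job to finish

# Transform: materialize a cleaned, aggregated table with a SQL query.
transform_sql = """
    CREATE OR REPLACE TABLE `example-project.analytics.daily_orders` AS
    SELECT
      DATE(order_ts) AS order_date,
      region,
      COUNT(*) AS order_count,
      SUM(amount) AS revenue
    FROM `example-project.staging.orders_raw`
    WHERE amount IS NOT NULL
    GROUP BY order_date, region
"""
client.query(transform_sql).result()  # block until the transform completes
```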
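
For the data egress side, the posting names the gcloud, gsutil, and bq command-line tools; the sketch below shows the equivalent table export through the Python client instead, again with made-up resource names, since the actual export format and destination are not specified.

```python
# Egress sketch: export a BigQuery table to Cloud Storage via the Python client.
# Equivalent to a `bq extract` CLI call; resource names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

extract_config = bigquery.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON,
    compression=bigquery.Compression.GZIP,
)
extract_job = client.extract_table(
    "example-project.analytics.daily_orders",
    "gs://example-bucket/exports/daily_orders_*.json.gz",
    job_config=extract_config,
)
extract_job.result()  # block until the export completes
```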
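
The Looker API/SDK requirement could look something like the following sketch, which uses the official looker-sdk Python package. It assumes API credentials are available in a looker.ini file or LOOKERSDK_* environment variables, and the Look ID is made up; the posting does not say which automations the team runs, so this only illustrates the general pattern of listing dashboards and pulling a Look's results programmatically.

```python
# Looker SDK automation sketch: list dashboards and fetch a Look's results.
# Assumes looker-sdk is installed and credentials come from looker.ini or
# LOOKERSDK_* environment variables; the Look ID below is hypothetical.
import looker_sdk

sdk = looker_sdk.init40()  # Looker API 4.0 client

# Enumerate dashboards, e.g. to audit titles or drive scheduled exports.
for dash in sdk.all_dashboards(fields="id,title"):
    print(dash.id, dash.title)

# Run a saved Look and capture its results as CSV for downstream automation.
csv_data = sdk.run_look(look_id="42", result_format="csv")
with open("look_42.csv", "w") as fh:
    fh.write(csv_data)
```

The same client can drive embedding and content management tasks mentioned in the role, but those depend on instance-specific configuration, so they are left out of this sketch.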