Acumenz Consulting

W2 / Own Corp Contract Position: Big Data Engineer (GCP & BigQuery Specialist) | New Jersey – Onsite – Locals Only | Visa: USC / GC / H4-EAD Only

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Big Data Engineer (GCP & BigQuery Specialist) position based in New Jersey, onsite, for locals only. Contract length is unspecified, with pay rate undisclosed. Requires strong GCP and BigQuery expertise, advanced SQL skills, and experience with data pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Corp-to-Corp (C2C)
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#API (Application Programming Interface) #Storage #IAM (Identity and Access Management) #Data Processing #Scala #SQL (Structured Query Language) #Airflow #ETL (Extract, Transform, Load) #Data Architecture #SQL Queries #Data Modeling #Data Engineering #Snowflake #Clustering #Data Lake #Data Quality #Python #Data Science #GCP (Google Cloud Platform) #Big Data #Datasets #Cloud #Data Warehouse #BigQuery #Dataflow #JSON (JavaScript Object Notation) #Data Pipeline #Security #Batch
Role description
Job Title: Big Data Engineer (GCP & BigQuery Specialist)
Location: New Jersey – Onsite – Locals only
Visa: USC / GC / H4-EAD Only

Job Summary
We are seeking an experienced Big Data Engineer with strong expertise in Google Cloud Platform (GCP) and BigQuery to design, develop, and maintain scalable data solutions. The ideal candidate will have hands-on experience building data pipelines, optimizing large-scale datasets, and implementing cloud-native data architectures.

Key Responsibilities
• Design, develop, and maintain scalable data pipelines on GCP
• Build and optimize data warehouses using BigQuery
• Implement ETL/ELT processes using tools such as:
  o Cloud Dataflow
  o Cloud Composer (Airflow)
  o Cloud Functions
• Develop batch and real-time data processing solutions
• Optimize SQL queries and improve BigQuery performance and cost efficiency
• Implement data modeling techniques (star schema, snowflake schema, partitioning, clustering)
• Work with structured and semi-structured data (JSON, Avro, Parquet)
• Ensure data quality, governance, and security best practices
• Collaborate with data scientists, analysts, and business stakeholders
• Monitor pipelines and troubleshoot production issues
• Implement CI/CD for data workflows

Required Skills & Qualifications
Technical Skills
• Strong experience with Google Cloud Platform (GCP)
• Expert-level knowledge of BigQuery
• Proficiency in SQL (advanced query optimization)
• Hands-on experience with:
  o Dataflow
  o Pub/Sub
  o Cloud Storage
  o Cloud Composer (Airflow)
• Experience with Python and/or Scala
• Experience designing data lakes and data warehouses
• Familiarity with IAM, security, and access control in GCP
• Experience with API integrations and streaming data
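The partitioning and clustering techniques named in the responsibilities can be sketched as BigQuery DDL. This is a minimal illustration only; the dataset, table, and column names (`sales_dw.fact_orders`, `order_date`, `customer_id`) are hypothetical and not part of the posting.

```python
# Minimal sketch: build a BigQuery CREATE TABLE statement that uses
# date partitioning and clustering, as mentioned in the role description.
# All identifiers below are hypothetical examples.
def fact_table_ddl(dataset: str, table: str) -> str:
    """Return DDL for a date-partitioned, clustered fact table."""
    return (
        f"CREATE TABLE `{dataset}.{table}` (\n"
        "  order_id STRING,\n"
        "  customer_id STRING,\n"
        "  order_date DATE,\n"
        "  amount NUMERIC\n"
        ")\n"
        "PARTITION BY order_date\n"        # partition pruning cuts scanned bytes
        "CLUSTER BY customer_id, order_id" # clustering co-locates filtered rows
    )

print(fact_table_ddl("sales_dw", "fact_orders"))
```

Partitioning by the date column limits how much data a query scans (and therefore its cost), while clustering sorts data within each partition so filters on the clustered columns read fewer blocks.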