TEK NINJAS

GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior GCP Data Engineer on a 12-month contract, hybrid (onsite 2–3 days/week); the pay rate is not disclosed. It requires 10+ years in data engineering, 5+ years in GCP, and telecom or automotive data experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 13, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
New Jersey, United States
🧠 - Skills detailed
#Deployment #Cloud #Data Security #IAM (Identity and Access Management) #ML (Machine Learning) #BigQuery #Dataflow #GIT #Data Engineering #PySpark #Data Quality #Telematics #Data Processing #Apache Beam #Data Lake #Hadoop #Airflow #GCP (Google Cloud Platform) #SQL (Structured Query Language) #DevOps #Batch #ETL (Extract, Transform, Load) #Terraform #Python #IoT (Internet of Things) #Datasets #AI (Artificial Intelligence) #Data Pipeline #Security #Spark (Apache Spark) #Monitoring #Data Science #Automation
Role description
Role: Senior GCP Data Engineer
Location: Hybrid – Onsite 2–3 days/week
Type: 12-Month Contract
Experience: 10+ Years Total | 5+ Years in GCP Data Engineering

Key Responsibilities
• Design, develop, and maintain data pipelines and ETL workflows using GCP services such as Dataflow, Dataproc, BigQuery, Cloud Composer, and Pub/Sub.
• Build data lake and warehouse architectures optimized for telecom network data, IoT sensor data, and vehicle telematics.
• Implement streaming and batch data processing for high-volume telecom and automotive datasets.
• Develop data models and analytics frameworks supporting network optimization, predictive maintenance, connected-car analytics, and customer experience insights.
• Collaborate with data scientists, ML engineers, and business teams to enable advanced analytics and machine learning workloads.
• Ensure data quality, security, and governance across all GCP environments.
• Automate deployment and monitoring using Terraform, Cloud Build, and CI/CD pipelines in GCP DevOps.

Required Skills
• 10+ years of total experience in data engineering, with 5+ years on GCP.
• Strong hands-on experience with BigQuery, Dataflow (Apache Beam), Dataproc (Spark/Hadoop), Cloud Composer (Airflow), and Pub/Sub.
• Proficiency in Python, SQL, and PySpark for data transformation and orchestration.
• Deep understanding of data lake, warehouse, and streaming architectures.
• Experience working with telecom OSS/BSS data, network telemetry, or automotive IoT/vehicle data.
• Knowledge of data security, IAM, and governance frameworks within GCP.

Preferred Skills
• Familiarity with Machine Learning pipelines (Vertex AI, AI Platform).
• Experience integrating real-time event data from connected devices or network systems.
• Hands-on experience with Terraform, Git, and CI/CD automation in GCP.
• GCP Certifications (e.g., Professional Data Engineer, Professional Cloud Architect) are a plus.
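
To make the streaming responsibility above concrete, here is a minimal sketch of the kind of pipeline this role describes, written in Python with Apache Beam: it reads JSON telematics events from Pub/Sub, parses them, and appends them to BigQuery. The project, topic, table, and field names are hypothetical placeholders, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON vehicle-telematics event read from Pub/Sub (hypothetical schema)."""
    record = json.loads(message.decode("utf-8"))
    return {
        "vehicle_id": record["vehicle_id"],
        "event_time": record["event_time"],
        "speed_kph": record.get("speed_kph"),
    }


def run() -> None:
    # All identifiers below are placeholders for illustration only.
    options = PipelineOptions(
        streaming=True,
        project="example-project",
        region="us-east1",
        runner="DataflowRunner",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw event bytes from a Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/vehicle-telemetry")
            # Parse each message into a BigQuery-ready dict.
            | "ParseJson" >> beam.Map(parse_event)
            # Append parsed rows to a BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:telematics.vehicle_events",
                schema="vehicle_id:STRING,event_time:TIMESTAMP,speed_kph:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same Beam code runs locally under the DirectRunner for testing and on Dataflow when submitted with the DataflowRunner option; in an environment like the one described above it would typically be scheduled and monitored through Cloud Composer and CI/CD tooling.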