GCP Data Engineer - W2 Only

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in the Telecom industry, based on-site in Irving, TX, for 6+ months at a pay rate of "X" per hour. Requires 6+ years in Data Engineering (10+ years overall IT experience), expertise in Apache Beam, BigQuery, Hadoop, and Hive, and strong SQL and data modeling skills.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date discovered
September 24, 2025
πŸ•’ - Project duration
6+ months
🏝️ - Location type
On-site
πŸ“„ - Contract type
W2 Contractor
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Irving, TX
🧠 - Skills detailed
#Data Modeling #Computer Science #Apache Beam #Hadoop #Hive #Datasets #Security #Data Engineering #Scala #SQL (Structured Query Language) #Data Pipeline #ETL (Extract, Transform, Load) #Data Processing #Data Quality #Data Science #Cloud #BigQuery #GCP (Google Cloud Platform) #Big Data
Role description
Please share suitable profiles with srikanth@cloudingest.com.

Job Title: GCP Data Engineer
Industry: Telecom
Location: Irving, TX (Onsite)
Experience: 6+ years relevant Data Engineering | 10+ years overall IT experience

Job Description:
We are seeking a skilled GCP Data Engineer to design, develop, and maintain scalable data pipelines and analytics solutions on Google Cloud Platform. The ideal candidate will have strong experience with Big Data technologies and hands-on expertise in building robust ETL and data processing workflows.

Key Responsibilities:
• Design and implement data pipelines using Apache Beam and other ETL frameworks.
• Develop and manage datasets in BigQuery for analytics and reporting.
• Work with Hadoop and Hive for large-scale data processing.
• Collaborate with data scientists, analysts, and business teams to deliver high-quality data solutions.
• Ensure data quality, performance, and security across pipelines.

Required Skills:
• Apache Beam, BigQuery, Hadoop, and Hive (mandatory)
• Strong SQL and data modeling skills
• Experience building ETL pipelines and handling large-scale datasets
• Cloud platform experience: GCP
• Strong problem-solving and communication skills

Education:
• Bachelor's degree in Computer Science, Information Technology, or a related field