

KPG99 INC
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with 4+ years of GCP experience, focusing on data pipelines and data warehouse solutions. The contract is long-term, the role is fully remote, and candidates should have a Bachelor's degree in computer science or equivalent experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 8, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Automated Testing #Scripting #Hadoop #BigQuery #Spark (Apache Spark) #Airflow #Data Warehouse #Python #Jira #Data Engineering #Java #Data Pipeline #Data Lake #BitBucket #Scrum #GCP (Google Cloud Platform) #RDBMS (Relational Database Management System) #Computer Science #Perl #Scala #Data Processing #Programming #Agile
Role description
Please find the job description below and let me know if you are interested.
Position: GCP Data Engineer (Previously Worked with Walmart)
Location: 100% Remote
Preferred: Independent Consultant
Duration: Long Term Contract
Engagement: W2 or 1099 preferred
Required Skills & Experience:
GCP Experience
4+ years of recent GCP experience
Experience building data pipelines in GCP
Experience with GCP Dataproc, GCS, and BigQuery (a minimal pipeline sketch follows this list)
5+ years of hands-on experience developing data warehouse solutions and data products.
5+ years of hands-on experience developing distributed data processing platforms with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution.
2+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Experience working with, processing, and managing large data sets (multi-TB/PB scale).
Exposure to test-driven development and automated testing frameworks.
Background in Scrum/Agile development methodologies.
Capable of delivering on multiple competing priorities with little supervision.
Excellent verbal and written communication skills.
Bachelor's degree in computer science or equivalent experience.
The most successful candidates will also have experience with the following:
Gitflow
Atlassian products: BitBucket, JIRA, Confluence, etc.
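For illustration, here is a minimal, hypothetical sketch of the kind of pipeline this role describes: an Airflow DAG that runs a PySpark job on Dataproc over data in GCS and loads the result into BigQuery. It assumes Airflow 2.x with the apache-airflow-providers-google package installed; the project, region, bucket, cluster, script, and table names are placeholders, not details from this posting.

```python
# Hypothetical sketch only: orchestrate a Dataproc PySpark transform over GCS data
# and load the output into BigQuery. All GCP resource names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

PROJECT_ID = "example-project"   # placeholder
REGION = "us-central1"           # placeholder
BUCKET = "example-data-lake"     # placeholder

with DAG(
    dag_id="gcs_dataproc_bigquery_example",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run a PySpark job on an existing Dataproc cluster; the script reads raw
    # files from GCS and writes Parquet output back to the bucket.
    transform = DataprocSubmitJobOperator(
        task_id="spark_transform",
        project_id=PROJECT_ID,
        region=REGION,
        job={
            "placement": {"cluster_name": "etl-cluster"},  # placeholder cluster
            "pyspark_job": {"main_python_file_uri": f"gs://{BUCKET}/jobs/transform.py"},
        },
    )

    # Load the transformed Parquet files from GCS into a BigQuery table.
    load = GCSToBigQueryOperator(
        task_id="load_to_bigquery",
        bucket=BUCKET,
        source_objects=["output/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table=f"{PROJECT_ID}.analytics.daily_facts",  # placeholder
        write_disposition="WRITE_TRUNCATE",
    )

    transform >> load
```

Separating the heavy transformation step (Dataproc/Spark) from the load into BigQuery keeps warehouse compute light and lets each stage be retried independently.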
Thanks and Regards
Karan Rajput | US IT Recruiter
Desk: 609-973-8207 || Phone: 201-351-8981 || KRajput@kpgtech.com