

Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 5–7+ years of GCP experience, focused on building scalable data pipelines. The contract runs 6+ months, fully remote, at a pay rate of "X". Key skills include Terraform, Apache Spark, and data modeling.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
June 12, 2025
-
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Fixed Term
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Scala #Data Modeling #Database Design #BigQuery #SciPy #GCP (Google Cloud Platform) #PySpark #Cloud #SQL (Structured Query Language) #Storage #NumPy #Libraries #Automation #Apache Spark #Terraform #Data Engineering #Pandas #Big Data #Spark (Apache Spark) #Python #Scripting #Hadoop #Data Pipeline
Role description
Title: GCP Data Engineer
Location: 100% Remote
Duration: 6+ month contract with a good chance of extension
Must complete the "Data Engineer Assessment Test"
We are seeking an experienced Data Engineer with strong expertise in Google Cloud Platform (GCP) and cloud-native data engineering tools. The ideal candidate will have a proven track record of building scalable data pipelines, working with modern big data tools, and supporting complex cloud environments.
Required Qualifications:
• Bachelor’s Degree or equivalent years of experience
• 5–7+ years of hands-on experience in data engineering
• Proficient in GCP, including services like BigQuery, Cloud Storage, Pub/Sub, Cloud Functions, Cloud Run, Cloud Scheduler, and Cloud SQL
• Strong experience with Apache Spark, PySpark, Hadoop, and Hive (see the PySpark sketch after this list)
• Solid understanding of cloud computing concepts: VMs, networking, and storage
• Hands-on experience with GCP APIs and SDKs
• Skilled in data modeling, database design, and SQL
• Strong problem-solving, analytical, and communication skills
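To ground the Spark and GCP expectations, here is a minimal PySpark sketch (illustrative, not from the posting) that reads CSV files from Cloud Storage and appends them to BigQuery. All project, bucket, dataset, and table names are hypothetical, and it assumes the open-source spark-bigquery connector is available on the cluster, e.g., supplied via --jars when submitting to Dataproc.

```python
# Minimal sketch: Cloud Storage -> light transform -> BigQuery.
# All names below are placeholders, not real resources.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery").getOrCreate()

# Read raw events from a (hypothetical) landing bucket.
raw = (
    spark.read
    .option("header", True)
    .csv("gs://example-landing-bucket/events/*.csv")
)

# Example transform: keep valid rows and stamp the load time.
cleaned = (
    raw.filter(F.col("event_id").isNotNull())
    .withColumn("loaded_at", F.current_timestamp())
)

# Write via the spark-bigquery connector; the temporary GCS bucket
# is required by the connector's indirect write method.
(
    cleaned.write.format("bigquery")
    .option("table", "example-project.analytics.events")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)
```

On Dataproc, a job like this would typically be submitted with gcloud dataproc jobs submit pyspark.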
Top 3 Required Skills:
• In-depth knowledge of the GCP environment
• Terraform development for infrastructure automation
• Experience in database modeling (see the BigQuery sketch after this list)
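As one concrete illustration of GCP SDK usage and database modeling together, the sketch below uses the google-cloud-bigquery Python client to run DDL that creates a partitioned, clustered table. The project, dataset, and schema are invented for the example; real partitioning and clustering choices would depend on actual query patterns.

```python
# Illustrative only: create a partitioned, clustered BigQuery table
# via the google-cloud-bigquery client. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics.orders` (
  order_id STRING NOT NULL,
  customer_id STRING,
  order_ts TIMESTAMP,
  amount NUMERIC
)
PARTITION BY DATE(order_ts)   -- prune scans by day
CLUSTER BY customer_id        -- co-locate rows for common filters
"""

client.query(ddl).result()  # result() blocks until the DDL job completes
```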
Preferred Qualifications:
• Experience scripting in Python for data tasks
• Familiarity with Python libraries like NumPy, Pandas, and SciPy (a short Pandas example follows this list)
• Background in the healthcare or insurance domain is a plus
• Additional hands-on experience with Hadoop and Hive
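For the Python scripting and Pandas/NumPy items above, here is a short, hypothetical example of a typical data task: cleaning a claims extract (a nod to the healthcare/insurance preference) and writing a monthly summary. File paths and column names are made up; to_parquet additionally requires pyarrow or fastparquet.

```python
# Hypothetical data task: clean a claims CSV and produce a monthly
# per-provider summary. Paths and columns are invented for illustration.
import numpy as np
import pandas as pd

df = pd.read_csv("claims.csv", parse_dates=["service_date"])

# Drop rows missing a claim id and clip negative amounts to zero.
df = df.dropna(subset=["claim_id"])
df["amount"] = np.clip(df["amount"], 0, None)

# Bucket by calendar month for the rollup.
df["month"] = df["service_date"].dt.strftime("%Y-%m")

summary = (
    df.groupby(["month", "provider_id"])["amount"]
    .agg(["count", "sum", "mean"])
    .reset_index()
)

summary.to_parquet("monthly_claims_summary.parquet", index=False)
```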