Cloud Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Cloud Engineer on a 6-month remote contract; the pay rate is not specified. Key skills include GCP expertise, Terraform development (5-7+ years), and database modeling. Experience with data engineering tools such as Apache Spark and Hadoop is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 17, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Fixed Term
🔒 - Security clearance
Unknown
📍 - Location detailed
New Jersey, United States
🧠 - Skills detailed
#Data Manipulation #Google Cloud Storage #Leadership #Database Design #Cloud #Data Modeling #Terraform #Pandas #Storage #Hadoop #Libraries #GCP (Google Cloud Platform) #Data Engineering #Dataflow #BigQuery #PySpark #NumPy #SciPy #Spark (Apache Spark) #Apache Spark #Python #SQL (Structured Query Language)
Role description
100% remote. Candidates will need to complete a technical assessment as part of the interview process.

Top 3 skills: knowledge of the GCP environment, Terraform development experience, and database modeling experience.

Job Description

We are looking for a GCP Engineer for a 6-month, fully remote contract opportunity. The selected candidate will be responsible for the definition, development, and implementation of new systems and major enhancements to existing systems, as well as production support for highly complex systems.

What You'll Do

• Utilize expertise in Google Cloud Platform (GCP) and data engineering tools like Apache Spark, Hadoop, and Hive to drive innovative solutions and ensure robust system functionality
• Provide project leadership for major feasibility or business systems analysis studies
• Design and implement cloud-based data engineering solutions using GCP services such as Google Cloud Storage, BigQuery, and Cloud SQL
• Provide production support for highly complex systems, troubleshooting issues and optimizing performance
• Conduct data preprocessing and feature engineering tasks using GCP tools like Cloud Dataflow and BigQuery (an illustrative sketch follows at the end of this description)

What You'll Need

Required

• Bachelor's degree, or additional years of experience in lieu of a degree
• 5-7+ years of Terraform development experience
• Experience with Google Cloud Platform (GCP), including its APIs and SDKs
• Familiarity with GCP services such as Google Cloud Storage, Cloud Functions, Pub/Sub, Cloud Scheduler, Cloud Run, BigQuery, and Cloud SQL
• Experience with data engineering tools and technologies, including Apache Spark, PySpark, Hadoop, and Hive
• Experience with cloud-based data engineering tools and services
• Familiarity with cloud computing concepts such as virtual machines, storage, and networking
• Familiarity with data modeling and database design
• Strong SQL skills
• Strong problem-solving and analytical skills
• Excellent communication and teamwork skills

Preferred

• Experience writing Python scripts for data engineering tasks
• Familiarity with Python libraries for data manipulation and analysis, such as NumPy, Pandas, and SciPy
• Healthcare or insurance background
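To make the preprocessing and feature engineering duties concrete, here is a minimal PySpark sketch of that kind of task. It is an illustration only: the bucket paths and column names are hypothetical, not from the posting, and it assumes a Spark environment with the Cloud Storage connector configured (as on Dataproc).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: aggregate raw claim records stored in Cloud Storage
# into per-member monthly features. All paths and column names below are
# placeholders, not taken from the job posting.
spark = SparkSession.builder.appName("claims-feature-engineering").getOrCreate()

# Read raw records from a (hypothetical) GCS bucket.
claims = spark.read.parquet("gs://example-bucket/raw/claims/")

features = (
    claims
    .withColumn("claim_month", F.date_trunc("month", F.col("claim_date")))
    .groupBy("member_id", "claim_month")
    .agg(
        F.sum("claim_amount").alias("total_claim_amount"),
        F.count("*").alias("claim_count"),
    )
)

# Write the feature table back to Cloud Storage as Parquet, from where it
# can be loaded into BigQuery (e.g., via a load job or the BigQuery API).
features.write.mode("overwrite").parquet("gs://example-bucket/features/claims_monthly/")

spark.stop()
```

Writing Parquet to Cloud Storage and loading it into BigQuery afterward is one common pattern; the spark-bigquery connector can also write to BigQuery directly.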