

Cloud Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a contract Cloud Data Engineer position at up to 40 hours per week, with a pay rate of $40.00 - $60.00 per hour. Key skills include Python, PySpark, and Airflow, plus experience with GCP or AWS; healthcare data experience is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$320 - $480 (derived from $40.00 - $60.00/hour over an 8-hour day)
🗓️ - Date discovered
May 14, 2025
🕒 - Project duration
Short-term (with possible extension)
🏝️ - Location type
Remote
📄 - Contract type
Contract, Temporary
🔒 - Security clearance
Unknown
📍 - Location detailed
Remote
🧠 - Skills detailed
#Data Engineering #Cloud #BigQuery #GCP (Google Cloud Platform) #Kubernetes #Storage #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Spark (Apache Spark) #Data Pipeline #Data Warehouse #Redshift #PySpark #Python #Big Data #Airflow
Role description
Job Overview
We are seeking a skilled and motivated Data Engineer to join our stealth startup for a short-term contract (with possible extension). The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines and architectures to support our data-driven initiatives. This role requires a strong understanding of big data technologies and cloud services.
Responsibilities:
Design, build, and maintain ETL pipelines using PySpark (a minimal sketch follows this list)
Orchestrate workflows using Google Cloud Composer or Airflow on GKE (also sketched below)
Monitor and optimize job performance, data reliability, and resource usage
Design and implement data models for efficient consumption and storage
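For a sense of the expected level, here is a minimal sketch of the kind of PySpark ETL job the first responsibility describes. The bucket paths, column names, and aggregation logic are hypothetical placeholders, not details taken from this posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical locations; a real job would read these from config.
SOURCE_PATH = "gs://example-bucket/raw/events/"
TARGET_PATH = "gs://example-bucket/curated/daily_events/"

def run_pipeline() -> None:
    spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

    # Extract: read raw JSON events from object storage.
    raw = spark.read.json(SOURCE_PATH)

    # Transform: drop invalid rows, then aggregate events per user per day.
    daily = (
        raw.filter(F.col("user_id").isNotNull())
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("user_id", "event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Load: write date-partitioned Parquet for downstream consumers.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(TARGET_PATH)

    spark.stop()

if __name__ == "__main__":
    run_pipeline()
```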
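And a minimal Airflow 2.x DAG of the sort Cloud Composer would schedule around such a job; the DAG id, schedule, and spark-submit target are assumptions for illustration only.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_events_etl",  # hypothetical DAG id
    default_args=default_args,
    start_date=datetime(2025, 5, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Submit the PySpark job sketched above; the script path is a placeholder.
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command="spark-submit gs://example-bucket/jobs/daily_events_etl.py",
    )
```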
Skills and Qualifications:
At least 4-5 years of professional experience with the following:
Python
PySpark
Airflow
GCP and/or AWS
Strong experience designing and building data pipelines that scale
Hands-on experience configuring and managing Kubernetes clusters
Deep understanding of BigQuery/Redshift data warehouses (see the load sketch after this list)
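As a rough illustration of the warehouse side, the sketch below loads curated Parquet into BigQuery using the open-source spark-bigquery connector. The table name and staging bucket are placeholders, and the connector jar is assumed to be available on the cluster (e.g. via --jars or a Dataproc image that bundles it).

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bq-load").getOrCreate()

# Read the curated output of the ETL job (placeholder path).
daily = spark.read.parquet("gs://example-bucket/curated/daily_events/")

# Write to BigQuery; "table" and "temporaryGcsBucket" are connector options.
(
    daily.write.format("bigquery")
    .option("table", "analytics.daily_events")  # placeholder dataset.table
    .option("temporaryGcsBucket", "example-staging-bucket")  # placeholder
    .mode("overwrite")
    .save()
)
```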
Nice-to-haves:
Prior experience with healthcare data
Job Types: Contract, Temporary
Pay: $40.00 - $60.00 per hour
Expected hours: 40 per week
Compensation Package:
Weekly pay
Schedule:
8-hour shift
Work Location: Remote