Red Oak Technologies

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Sunnyvale, CA, lasting 12 months with a possible extension. Required skills include Apache Spark, Apache Flink, data modeling, ETL design, and proficiency in Python, Java, or Scala. Onsite work is mandatory on the client's hybrid schedule (three days per week).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
πŸ—“οΈ - Date
March 27, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Sunnyvale, CA
-
🧠 - Skills detailed
#Scala #Azure #Apache Spark #Project Management #Storage #Kafka (Apache Kafka) #AWS (Amazon Web Services) #Cloud #Data Storage #Spark (Apache Spark) #GCP (Google Cloud Platform) #Data Lake #Java #Data Modeling #Data Engineering #Spark SQL #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language)
Role description
Since 1995, Red Oak Technologies has been a trusted partner in the tech industry, delivering innovative talent solutions that drive progress. We specialize in quickly acquiring and efficiently matching top-tier professional talent with clients in immediate need of highly skilled contract, permanent, or project-management-based resources.

"NOTE: If selected for this position, you are required to perform ALL work onsite, based on the client's specified hybrid work schedule (currently onsite 3 days a week: Tuesday, Wednesday, and Thursday)."

Title: Data Engineer
Location: Sunnyvale, CA
Duration: 12 months + possible extension

Required Skills:
• Strong experience with:
• Apache Spark (Spark SQL, Structured Streaming)
• Apache Flink (event-time processing, stateful streams)
• Solid understanding of distributed systems
• Experience with data modeling and ETL design
• Proficiency in Python, Java, or Scala
• Familiarity with streaming systems (e.g., Kafka)
• Experience with cloud platforms (AWS / GCP / Azure)
• Knowledge of data storage systems (data lakes, warehouses)