

Saransh Inc
Senior Data Engineer with Scala, Java & Spark - W2 Role
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in San Francisco, CA, on a 1-year W2 contract. It requires 2-5 years of Scala, Java, and Spark experience, plus SQL, NoSQL, and Big Data tooling. Hybrid work model, 3 days a week onsite.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 9, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Francisco, CA
-
🧠 - Skills detailed
#Data Warehouse #Impala #Teradata #Data Modeling #BigQuery #NiFi (Apache NiFi) #Spark (Apache Spark) #AWS (Amazon Web Services) #Redshift #Java #SQL (Structured Query Language) #Cloud #Azure #Big Data #Scala #Kafka (Apache Kafka) #Airflow #Data Engineering #NoSQL
Role description
Role: Senior Data Engineer
Location: San Francisco, CA (Hybrid - 3 days a week onsite)
Job Type: W2 Contract
Length: 1 year, can be extended
Note: Only visa-independent candidates will be considered (no C2C or third-party candidates)
Must Have: Scala, Java + Spark (strong Java with some Scala + Spark is acceptable)
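To illustrate the "strong Java with some Scala + Spark" profile, here is a minimal sketch of the kind of Spark batch job this role involves, written in Scala. The dataset, paths, and column names are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal Spark batch job sketch: read structured event data,
// aggregate it per day and type, and write the result back out.
// All paths and columns below are made-up examples.
object DailyEventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-event-counts")
      .getOrCreate()

    // Read structured data (Parquet here; could equally be Hive tables)
    val events = spark.read.parquet("s3://example-bucket/events/")

    // Aggregate: count events per type per day
    val daily = events
      .withColumn("event_date", to_date(col("event_ts")))
      .groupBy(col("event_date"), col("event_type"))
      .agg(count("*").as("event_count"))

    // Write partitioned output for downstream consumers
    daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/daily_event_counts/")

    spark.stop()
  }
}
```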
Minimum Qualifications
• 2-5+ years of Scala and Java development experience.
• 2+ years of SQL and NoSQL experience (handling structured and unstructured data).
• 1+ years of extensive experience with the Spark processing engine.
• 1+ years of experience with Big Data tools, technologies, and streaming (Hive, Impala, Oozie, Airflow, NiFi, Kafka); see the streaming sketch after this list.
• 1+ years of experience with Data Modeling.
• Experience analyzing data to discover opportunities and address gaps.
• Experience working with cloud or on-prem Big Data platforms (e.g., Netezza, Teradata, AWS Redshift, Google BigQuery, Azure Data Warehouse, or similar).
• Candidates must work onsite 3 days a week.
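For the streaming requirement, a rough sketch of consuming a Kafka topic with Spark Structured Streaming in Scala. It assumes the spark-sql-kafka connector is on the classpath; the broker address, topic name, and checkpoint path are made-up placeholders.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical sketch: consume a Kafka topic with Spark Structured
// Streaming and count messages per key in 5-minute windows.
// Broker, topic, and checkpoint location are placeholders.
object KafkaWindowedCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-windowed-counts")
      .getOrCreate()

    // Subscribe to a Kafka topic as a streaming source
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Count records per key over 5-minute event-time windows
    val counts = raw
      .selectExpr("CAST(key AS STRING) AS key", "timestamp")
      .groupBy(window(col("timestamp"), "5 minutes"), col("key"))
      .count()

    // Console sink for illustration; production jobs would target
    // a durable sink with the same checkpointing mechanism
    val query = counts.writeStream
      .outputMode("complete")
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/kafka-windowed-counts")
      .start()

    query.awaitTermination()
  }
}
```

The checkpoint location is what lets the stream recover its Kafka offsets and window state after a restart, which is the usual design choice for fault-tolerant streaming jobs.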





