

Scala Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Scala Developer in NYC, NY, for 12 months at a competitive pay rate. Key skills include proficiency in Scala, Java experience, ETL pipeline development, and familiarity with SQL/NoSQL databases and cloud infrastructure (Azure).
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 30, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
New York, NY
Skills detailed
#Snowflake #MongoDB #Redshift #Teradata #Computer Science #Scala #Kafka (Apache Kafka) #Databricks #Apache Spark #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Azure #PostgreSQL #NoSQL #Databases #Java #Airflow #Cloud
Role description
Position: Java/Scala Developer
Location: NYC, NY (local or nearby candidates only; the role requires on-site, in-person work)
Duration: 12 months
Key Qualifications:
• Proficiency in Scala
• Some hands-on experience with Java
• Experience with SQL and NoSQL databases (Teradata, PostgreSQL, MongoDB)
• Expertise in ETL pipeline development and data warehousing (Snowflake, Redshift)
• Familiarity with cloud infrastructure (Azure)
• Experience with Apache Spark, Kafka, Databricks, and Airflow
This role provides an excellent opportunity to learn, grow, and work with leading technology in a fast-paced, innovative environment.
Educational Qualification:
• Minimum of a BS degree in Computer Science, Engineering, or a related field
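As a rough illustration of the ETL pipeline development the qualifications describe, the sketch below shows a tiny extract-and-transform flow in plain Scala. It is a hypothetical example using only the standard library (the `Trade` record, field names, and sample data are invented for illustration); production work of this kind would typically run on Apache Spark or Databricks as listed above.

```scala
// Hypothetical sketch: a minimal extract-transform flow in plain Scala,
// standing in for the Spark/ETL pipeline work described in the role.
object EtlSketch {
  // Invented record type for illustration only.
  case class Trade(symbol: String, qty: Int, price: Double)

  // Extract: parse raw CSV-style lines, silently skipping malformed rows.
  def extract(lines: Seq[String]): Seq[Trade] =
    lines.flatMap { line =>
      line.split(",").map(_.trim) match {
        case Array(s, q, p) =>
          try Some(Trade(s, q.toInt, p.toDouble))
          catch { case _: NumberFormatException => None }
        case _ => None
      }
    }

  // Transform: aggregate total notional value (qty * price) per symbol.
  def totalBySymbol(trades: Seq[Trade]): Map[String, Double] =
    trades.groupBy(_.symbol)
      .view
      .mapValues(_.map(t => t.qty * t.price).sum)
      .toMap

  def main(args: Array[String]): Unit = {
    val raw = Seq("AAPL,10,150.0", "MSFT,5,300.0", "AAPL,2,155.0", "bad,row")
    // The malformed "bad,row" line is dropped during extraction.
    println(totalBySymbol(extract(raw)))
  }
}
```

In a real pipeline the same shape recurs at larger scale: `extract` becomes a Spark or Kafka source, and `totalBySymbol` becomes a `groupBy`/aggregate over a distributed dataset, with the result loaded into a warehouse such as Snowflake or Redshift.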