

Data Engineer with Scala and Spark
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with Scala and Spark in NYC, requiring 8-9 years of experience. Key skills include strong Scala proficiency, Apache Spark, SQL, and familiarity with cloud platforms. On-site work is mandatory.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 26, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: New York, United States
Skills detailed
#Data Engineering #Kubernetes #Programming #HBase #Hadoop #AWS (Amazon Web Services) #Agile #Delta Lake #Kafka (Apache Kafka) #Docker #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Cloud #Batch #GCP (Google Cloud Platform) #Spark (Apache Spark) #Scala #Azure #Databricks #Version Control #Airflow #Apache Spark #Snowflake
Role description
Job: Data Engineer with Scala and Spark
Location: NYC
Experience: 8-9 years
Skills:
• Strong proficiency in Scala programming.
• Solid hands-on experience with Apache Spark (Batch and/or Streaming); see the sketch after this list.
• Familiarity with the Hadoop ecosystem, Hive, Kafka, or HBase.
• Experience with SQL and data transformation logic.
• Understanding of software engineering best practices (version control, CI/CD, testing).
• Experience with cloud platforms such as AWS, Azure, or GCP is a plus.
• Experience with Delta Lake, Databricks, or Snowflake.
• Familiarity with containerization and orchestration (Docker, Kubernetes, Airflow).
• Experience working in an Agile environment.
• Knowledge of data warehousing and ETL tools.
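To give candidates a concrete flavor of the day-to-day work, here is a minimal sketch of the kind of Scala/Spark batch transformation the role describes. It is illustrative only: the dataset, input/output paths, and column names are assumptions, not details from this posting.

```scala
// Minimal Spark batch job in Scala. The "orders" dataset, the S3 paths, and
// all column names below are hypothetical and used only for illustration.
import org.apache.spark.sql.{SparkSession, functions => F}

object DailyOrderRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-order-rollup")
      .getOrCreate()

    // Read raw order records (path and schema are assumptions for this sketch).
    val orders = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3://example-bucket/raw/orders/")

    // SQL-style transformation logic: filter, derive a date column, aggregate.
    val daily = orders
      .filter(F.col("status") === "COMPLETED")
      .withColumn("order_date", F.to_date(F.col("created_at")))
      .groupBy("order_date", "region")
      .agg(
        F.countDistinct("order_id").as("orders"),
        F.sum("amount").as("revenue")
      )

    // Write the curated output partitioned by date; Parquet is used here.
    daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/daily_orders/")

    spark.stop()
  }
}
```

The same transformation could be expressed with Structured Streaming against a Kafka source, or written to Delta Lake, Databricks, or Snowflake instead of Parquet, which maps onto the streaming and lakehouse items in the list above.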