

Scala Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Scala Developer on an 18-month W2 contract in Jersey City, paying $55-$65/hour. Requires 5+ years of experience, proficiency in Scala, and expertise in Hadoop/Spark/Flink, with a preference for regulatory reporting experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date discovered
July 22, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#Spark (Apache Spark) #Data Analysis #Data Processing #Databases #Scala #HBase #Programming #ETL (Extract, Transform, Load) #Big Data #Hadoop #HDFS (Hadoop Distributed File System)
Role description
Job Title: Scala Developer
Duration: 18 Month W2 Contract
Location: Jersey City, NJ
Required Pay Scale: $55-$65/hour
Financial regulation for banks has increased dramatically over the past few years. This role is to work on the Global Banking and Markets (GBAM) Non-Financial Regulatory Reporting (NFRR) Data Delivery program, a new initiative to define and implement consistent and efficient regulatory reporting processes that adhere to enterprise standards, simplify controls and enable re-use.
As part of this initiative, we are developing a Data Processing Framework using Scala and Big Data technologies for authoring domain-specific language (DSL) components that seamlessly combine data from multiple data sources. The DSL components interpret transformation rules written in a configuration-style syntax that can be applied to one or more standard data sets to produce a transformed data set. This framework provides a unified language for describing the data needs of a report.
• Enables users to seamlessly retrieve and combine data from multiple sources
• Enables users to author reports without having to worry about the mechanics of actually retrieving, filtering, projecting, or aggregating the data
• Ensures proper versioning of the report definitions and the running of the reports as they existed at specific points in time
• Ensures that the system is scalable enough to run hundreds of reports in parallel, if required
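To make the framework description concrete, here is a minimal, dependency-free sketch of what a configuration-style transformation DSL of this kind could look like. All names (`Rule`, `Filter`, `Project`, `Aggregate`, `applyRules`, the sample trade data) are illustrative assumptions, not the actual NFRR framework, and plain Scala collections stand in for Spark/Flink data sets:

```scala
// Hypothetical sketch of a config-style transformation DSL, using plain Scala
// collections in place of Spark/Flink datasets. Names are illustrative only.
object DslSketch {
  type Row = Map[String, Any]

  // A transformation rule written in a configuration-like style:
  // each rule names an operation and its parameters.
  sealed trait Rule
  final case class Filter(column: String, expected: Any)      extends Rule
  final case class Project(columns: List[String])             extends Rule
  final case class Aggregate(groupBy: String, sumCol: String) extends Rule

  // The interpreter applies rules in order, producing a transformed data set.
  def applyRules(rows: List[Row], rules: List[Rule]): List[Row] =
    rules.foldLeft(rows) {
      case (data, Filter(col, v)) => data.filter(_.get(col).contains(v))
      case (data, Project(cols))  => data.map(_.view.filterKeys(cols.contains).toMap)
      case (data, Aggregate(g, s)) =>
        data.groupBy(_(g)).map { case (key, group) =>
          val total = group.map(_(s).asInstanceOf[Double]).sum
          Map(g -> key, s -> total)
        }.toList
    }
}
```

A report definition then reduces to a list of rules, e.g. `List(Filter("region", "US"), Aggregate("desk", "notional"))`, which is exactly the kind of declarative description the bullets above call for: the author states *what* data the report needs, and the interpreter handles retrieval, filtering, projection, and aggregation.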
Role
We are looking for a Scala developer with Apache Hadoop/Spark/Flink experience to assist with the development of the regulatory reporting processing engine, which uses Scala and Big Data technologies to provision regulatory reporting data via domain-specific language (DSL) scripts. The role requires close partnership with the NFRR Program analysis and development teams.
Responsibilities
The candidate will work directly with the Director of the Data Processing Engine and will participate in all phases of development of the platform and subsequent reports.
• Develop technical specifications and component-level designs for DSL extensions for input and output
• Design parameterized and configurable modules for DSL Extensions
• Develop DSL for the data outputs needed for NFRR Reports
• Integrate Code Repository and Management
Qualifications
The candidate must be a self-starter, able to work in a fast-paced and results-driven environment with minimal oversight. The candidate is required to have excellent communication skills and possess a strong sense of accountability and responsibility.
• 5+ years development experience
• Good general Scala programming skills
• Experience with Hadoop Distributed File System (HDFS), HBase and Hive
• Experience with custom aggregations within Spark/Flink, preferred
• Experience with databases, a plus
• Experience on regulatory or reporting projects, preferred
• Ability to perform detailed and complex data analysis
• Attention to detail and ability to work independently
• Ability to handle tight deadlines and competing demands in a fast-paced environment
• Knowledge of Global Banking and Markets’ products/asset classes and associated data including fixed income, equities, derivatives, and foreign exchange securities, preferred
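The "custom aggregations within Spark/Flink" qualification refers to the accumulator contract those engines expose (in Spark, `org.apache.spark.sql.expressions.Aggregator` with its `zero`/`reduce`/`merge`/`finish` methods). As a hedged, dependency-free illustration of that contract, here is a weighted-average aggregator in plain Scala; the names and the simulated partitioning are assumptions for the sketch, not the project's actual code:

```scala
// Pure-Scala sketch of the zero / reduce / merge / finish aggregation contract
// used by Spark's Aggregator (and, analogously, Flink's AggregateFunction),
// runnable without either dependency. All names here are illustrative.
object WeightedAvgSketch {
  // Accumulator: running weighted sum and total weight.
  final case class Buf(weightedSum: Double, totalWeight: Double)

  val zero: Buf = Buf(0.0, 0.0)

  // reduce: fold one input (value, weight) into the accumulator.
  def reduce(b: Buf, value: Double, weight: Double): Buf =
    Buf(b.weightedSum + value * weight, b.totalWeight + weight)

  // merge: combine partial accumulators from different partitions.
  def merge(a: Buf, b: Buf): Buf =
    Buf(a.weightedSum + b.weightedSum, a.totalWeight + b.totalWeight)

  // finish: produce the final result from the accumulator.
  def finish(b: Buf): Double =
    if (b.totalWeight == 0.0) 0.0 else b.weightedSum / b.totalWeight

  // Simulate partitions being reduced independently and then merged,
  // mirroring how a distributed engine evaluates a custom aggregation.
  def weightedAvg(partitions: List[List[(Double, Double)]]): Double = {
    val partials =
      partitions.map(_.foldLeft(zero) { case (b, (v, w)) => reduce(b, v, w) })
    finish(partials.foldLeft(zero)(merge))
  }
}
```

The key design point, and what interviewers typically probe, is that `merge` must be associative and commutative so partial results computed on separate partitions can be combined in any order.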
Scala developer with Apache Hadoop/Spark/Flink experience
5-7 years
About Matlen Silver
Experience Matters. Let your experience be driven by our experience. For more than 40 years, Matlen Silver has delivered solutions for complex talent and technology needs to Fortune 500 companies and industry leaders. Led by hard work, honesty, and a trusted team of experts, Matlen Silver has built a solutions experience and legacy of success that makes a difference in the way the world works.
Matlen Silver is an Equal Opportunity Employer and considers all applicants for all positions without regard to race, color, religion, gender, national origin, age, sexual orientation, veteran status, the presence of a non-job-related medical condition or disability, or any other legally protected status.
If you are a person with a disability needing assistance with the application or at any point in the hiring process, please contact us at email and/or phone at: info@matlensilver.com // 908-393-8600