

Quartz/Big Data/Python Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Quartz/Big Data/Python Developer on a long-term contract in Jersey City, NJ, offering $68-$69/hr. Requires 6-10 years in technology, 3-5 years in Python and Big Data, and experience with HDFS, ETL, and financial instruments.
Country
United States
Currency
$ USD
-
Day rate
552
-
Date discovered
August 27, 2025
Project duration
Unknown
-
Location type
Hybrid
-
Contract type
W2 Contractor
-
Security clearance
Unknown
-
Location detailed
Jersey City, NJ
-
Skills detailed
#Big Data #Python #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Java #Hadoop #C# #Data Lake #PySpark #REST (Representational State Transfer) #Web Services #HDFS (Hadoop Distributed File System) #Impala #C++ #Agile #NoSQL
Role description
Our client, a leading financial services company, is hiring a Big Data/Python Developer on a long-term contract basis.
Job ID 83197
Work Location:
Jersey City, NJ (Hybrid): 3 days onsite and 2 days remote. Candidates must be local and attend a final in-person interview.
Pay: $68-$69/hr W2.
Role Overview
A senior developer who will be responsible for building an analytics platform on Hadoop infrastructure for the CFO Group within the Global Markets business.
Role Specific Responsibilities:
• Create a Finance Data Lake on HDFS
• Develop data transfer tools to replicate data from a NoSQL datastore to the Data Lake
• Create ETL jobs using PySpark (a minimal sketch of such a job follows this list)
• Build REST web services and APIs for use by the internal quantitative team
• Participate fully in the development process through the entire software lifecycle
• Participate fully in the Agile software development process
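For illustration, here is a minimal PySpark sketch of the kind of ETL job described above: it reads a daily export replicated from a NoSQL datastore and loads it into a Parquet-backed Finance Data Lake on HDFS. All paths, column names, and the partition key are hypothetical placeholders, not details taken from the posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("finance-data-lake-etl")
    .getOrCreate()
)

# Extract: read a daily JSON export replicated from the NoSQL datastore (hypothetical path).
raw = spark.read.json("hdfs:///data/raw/nosql_export/valuations/2025-08-27/")

# Transform: keep the fields the analytics platform needs and normalize their types.
cleaned = (
    raw.select("trade_id", "book", "valuation_usd", "as_of_date")
       .withColumn("valuation_usd", F.col("valuation_usd").cast("double"))
       .withColumn("as_of_date", F.to_date("as_of_date"))
       .dropDuplicates(["trade_id", "as_of_date"])
)

# Load: write partitioned Parquet into the Finance Data Lake on HDFS (hypothetical path).
(
    cleaned.write
           .mode("overwrite")
           .partitionBy("as_of_date")
           .parquet("hdfs:///data/lake/finance/valuations/")
)

spark.stop()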
GRANITE-VALUATION CONTROL ANALYTICS PLATFORM
• A minimum of 6-10 years of overall technology experience is required.
• A minimum of 6-10 years of designing and developing in an object-oriented environment with any OO language (e.g., Java, C++, C#) is required.
• 3-5 years of Python experience.
• 3-5 years of Big Data experience, with expertise in HDFS, Impala, and PySpark handling petabytes of data.
• 6-10 years of designing and tuning SQL database tables/queries is required.
• Experience with grid computing and high data volumes is a plus.
• Experience with object-oriented databases is a plus.
• Experience with test-driven development, functional testing, and continuous integration is desirable.
• Strong written and verbal communication skills.
• Experience with financial instruments and Price Verification processes is desirable.
• Prior Quartz experience.
• Demonstrable knowledge of the problem domain in which they have been working.
Keywords: Quartz, IPV, Valuation, Python, PySpark, HDFS, Impala, Petabytes of Data