RSC Solutions

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position with a 12-month contract, offering a pay rate of "X" per hour. It requires 10+ years of experience, proficiency in Azure services, and strong Python/PySpark skills, preferably in financial services. Location: Charlotte, N.C., Jersey City, N.J., or India.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 1, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Databricks #ADF (Azure Data Factory) #GIT #NoSQL #Jenkins #ETL (Extract, Transform, Load) #FastAPI #Azure Data Factory #Spark (Apache Spark) #PySpark #Azure #API (Application Programming Interface) #Consulting #Databases #Azure cloud #Data Engineering #Strategy #SQL (Structured Query Language) #Cloud #Azure Databricks #DevOps #Python #Data Strategy
Role description
SMBC is in the process of leading a Digital Transformation across our Americas Division as we continue to modernize our technology, focus on our data-driven approach, grow, and plan for the future. As a result of this expansion, we are seeking experienced software data engineers across Charlotte (N.C.), Jersey City (N.J.), and India, with 10+ years of relevant experience, to support the design and development of a strategic data platform for SMBC Capital Markets and Nikko Securities Group.

Role Objectives
• These roles will be part of the Data Strategy team spanning the SMBC Capital Markets and Nikko Securities teams, the SMBC Americas Division's broker-dealer and swap-dealer entities.
• These roles will be involved in the active development of the data platform in close coordination with the SMBC team, beginning with the establishment of a reference data system for securities and pricing data and later moving to other data domains.
• The consulting team will need to follow internal development standards to contribute to the overall agenda of the Data Strategy team.
• The implementation of this strategic platform on SMBC's Azure Cloud Platform will require the solutions and know-how listed in the qualifications below.

Qualifications and Skills
• Full-stack Python/PySpark developer
• Proven experience as a Data Engineer with experience in the Azure cloud
• Experience implementing solutions using:
• Azure cloud services
• Azure Data Factory
• Azure Data Lake Gen 2
• Azure Databases
• Azure Data Fabric
• API Gateway management
• Azure Functions
• Well versed in Azure Databricks
• Strong SQL skills with RDBMS or NoSQL databases
• Experience developing APIs using FastAPI or similar Python frameworks
• Familiarity with the DevOps lifecycle (Git, Jenkins, etc.) and CI/CD processes
• Good understanding of ETL/ELT processes
• Experience in the financial services industry, financial instruments, asset classes, and market data is a plus.