Jobs via Dice

Job Opening - Senior Data Engineer - Contract - NY

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in New York, NY, offering $87/HR on C2C or $77/HR on W2 for a contract position. Requires 10+ years of experience, strong skills in Python, Spark, ETL, and database systems, with financial industry experience preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
696
-
🗓️ - Date
April 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#C++ #SQL Server #Databricks #Database Systems #Programming #Java #Migration #Databases #ML (Machine Learning) #SQL (Structured Query Language) #Scripting #Triggers #Leadership #Cloud #Data Integration #PyTorch #Data Mart #Scala #Unix #Snowflake #ETL (Extract, Transform, Load) #Python #Spark (Apache Spark) #Statistics #Azure #RDBMS (Relational Database Management System) #Linux #Data Engineering #Teradata #Oracle #Data Warehouse #Angular #PySpark #Strategy #Informatica #Perl #Deployment
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, SANS, is seeking the following. Apply via Dice today!

Position Details
Rate: $87/HR on C2C or $77/HR on W2
Location: New York, NY
Onsite interview required

Department Profile
Global Banking Technology (GBT) is a dynamic and fast-paced area within the Firm's WM Technology Division. We are responsible for creating innovative technology solutions for the Private Banking Group (Client), one of the strategic growth areas of the Firm, providing cash management and lending products and services to our WM clients. This includes state-of-the-art technology for a nationwide network of Private Bankers and product specialists who work with Financial Advisors to provide access to products and services such as online banking, cards, deposit products, residential mortgages, securities-based loans, and tailored lending. If you are an exceptional individual who is interested in solving complex problems and building sophisticated solutions in a dynamic team environment, GBT is the place for you.

Position Description
The position is responsible for developing ETL components, providing user access to the data via reports and data extracts, utilizing analysis tools such as OLAP, and coding stored procedures. The candidate will work with multiple database systems (Teradata, Hive, SQL Server, DB2, and Snowflake), both on-premises and in the cloud. The role requires a strong understanding of database concepts, including data warehouses, operational data stores, and data marts. Responsibilities also require in-depth knowledge of ETL concepts and hands-on experience implementing data integrations on multiple database platforms using custom development, scripting languages such as Unix shell, and ETL tools such as Informatica.
The role also requires building data engineering pipelines using Spark in the cloud with tools such as Databricks and Snowflake, and reengineering statistical risk models currently written in C++ for lower latency in the cloud using newer technologies such as PyTorch. The candidate will design reusable components and utilities and bring out-of-the-box architectural thinking so that modelers have a seamless experience across the full model lifecycle: design, implementation, training, and deployment to production.

Responsibilities:
- Work with Quantitative Strategists/Statistical Modelers to build, enhance, and execute/test scenarios.
- Develop, run, and draw inference from machine learning and statistical models in the cloud.
- Identify potential improvements to the current design/processes.
- Assess risks in design/development upfront.
- Plan and coordinate data/process migrations across databases.
- Participate in multiple project discussions as a senior member of the team.
- Serve as a coach/mentor for junior developers and provide thought leadership.
- Lead the team in new initiatives such as cloud strategy.

Required Skills
- 10+ years of total experience.
- Strong Python, Spark, PyTorch, PySpark, Java, and C++ programming experience.
- Knowledge of machine learning, statistics, and model development, training, and inference.
- Strong design and problem-solving skills.
- Strong understanding of machine learning tools such as MLflow, Databricks, and Snowflake in the cloud.
- SQL and database programming skills, including creating views, stored procedures, and triggers, implementing referential integrity, and designing and coding for performance.
- Knowledge of and hands-on experience with RDBMS systems (e.g., DB2, Teradata, Oracle, or Snowflake).
- Good communication and leadership skills.
- Organization, discipline, detail orientation, self-motivation, and focus on delivery.
- In-depth knowledge of and hands-on experience with Unix/Linux programming (shell and/or Perl).
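For candidates gauging the ETL work described above, here is a minimal, hypothetical sketch of an extract-transform-load step in Python. It is an illustration only, not part of the posting: the table names (raw_trades, client_totals) are invented, and the standard-library sqlite3 module stands in for the RDBMS platforms named in the role (Teradata, DB2, Snowflake).

```python
# Minimal ETL sketch: extract rows from a staging table, transform them
# (aggregate amounts per client), and load the result into a mart table.
# sqlite3 is a stand-in for the role's actual database platforms.
import sqlite3


def run_etl(conn: sqlite3.Connection) -> int:
    """Run one extract-transform-load pass; return number of clients loaded."""
    cur = conn.cursor()
    # Extract: read raw rows from the staging table.
    rows = cur.execute("SELECT client_id, amount FROM raw_trades").fetchall()
    # Transform: aggregate trade amounts per client.
    totals: dict[str, float] = {}
    for client_id, amount in rows:
        totals[client_id] = totals.get(client_id, 0.0) + amount
    # Load: write the aggregates into the data-mart table.
    cur.executemany(
        "INSERT INTO client_totals (client_id, total) VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()
    return len(totals)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE raw_trades (client_id TEXT, amount REAL);
        CREATE TABLE client_totals (client_id TEXT, total REAL);
        INSERT INTO raw_trades VALUES ('A', 100.0), ('A', 50.0), ('B', 25.0);
        """
    )
    print(run_etl(conn))
```

In the role itself, the transform step would more likely run as a Spark job or a stored procedure, but the extract/transform/load shape is the same.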
Desired Skills
- Experience in the financial industry.
- Good understanding of data engineering principles and risk model development.
- ETL experience with Informatica.
- Experience with cloud databases (e.g., Azure, Snowflake).
- Experience with Scala/Spark/C++/PyTorch.
- Experience with AngularJS.
- Experience with KDB.