

Jobs via Dice
Senior Data Engineer (Fixed Income / Capital Markets)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Fixed Income / Capital Markets) in New York City, offering a long-term contract. Requires 10+ years of experience, expertise in Kafka, Snowflake, and Python, and strong domain knowledge in financial markets.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 4, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
New York, NY
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Spark (Apache Spark) #Compliance #SQL (Structured Query Language) #Data Pipeline #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Data Warehouse #Kafka (Apache Kafka) #Cloud #Data Processing #Automation #Snowflake #Scripting #MIFID (Markets in Financial Instruments Directive) #Data Manipulation #Data Integration #Programming #Python #Batch #Data Modeling #Azure #Scala #Data Engineering
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Tekfortune Inc., is seeking the following. Apply via Dice today!
Role: Senior Data Engineer (Fixed Income / Capital Markets)
Location: New York City, NY
Duration: Long Term Contract
Exp: 10+ Years
Local candidates with strong Fixed Income / Capital Markets domain experience are required.
Job Description:
Lead the development and optimization of batch and real-time data pipelines, ensuring scalability, reliability, and performance.
Architect, design, and deploy data integration, streaming, and analytics solutions leveraging Spark, Kafka, and Snowflake.
Proactively and voluntarily support team members and peers in delivering their tasks to ensure end-to-end delivery.
Evaluate technical performance challenges and recommend tuning solutions.
Serve as a hands-on Data Service Engineer to design, develop, and maintain our Reference Data System using modern data technologies including Kafka, Snowflake, and Python (a rough sketch follows below).
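The posting contains no code, so purely as an illustration of the kind of pipeline described above, here is a minimal sketch of a Python service that consumes reference-data messages from Kafka and micro-batches them into Snowflake. Every name in it (the refdata.instruments topic, the instruments table, the ETL_USER credentials) is a hypothetical assumption, not something specified in the role.

```python
# Hedged sketch: Kafka -> Snowflake reference-data loader.
# All topic, table, and credential names are illustrative assumptions.
import json

from confluent_kafka import Consumer
import snowflake.connector

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumption: local broker
    "group.id": "refdata-loader",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,             # commit only after a load
})
consumer.subscribe(["refdata.instruments"])  # hypothetical topic

conn = snowflake.connector.connect(
    user="ETL_USER", password="...", account="my_account",  # placeholders
    warehouse="ETL_WH", database="REFDATA", schema="PUBLIC",
)
cur = conn.cursor()

batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        rec = json.loads(msg.value())
        batch.append((rec["isin"], rec["name"], rec["asset_class"]))
        if len(batch) >= 500:                # micro-batch for throughput
            cur.executemany(
                "INSERT INTO instruments (isin, name, asset_class) "
                "VALUES (%s, %s, %s)",
                batch,
            )
            consumer.commit()                # offsets only after the write
            batch.clear()
finally:
    consumer.close()
    conn.close()
```

At real volumes, row-level INSERTs would likely give way to staged bulk loading (COPY INTO or Snowpipe); a sketch of that path follows the skills list below.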
Required Skills:
• Proven experience in building and maintaining data pipelines, especially using Kafka, Snowflake, and Python.
• Strong expertise in distributed data processing and streaming architectures.
• Experience with the Snowflake data warehouse platform: data loading, performance tuning, and management (see the sketch after this list).
• Proficiency in Python scripting and programming for data manipulation and automation.
• Familiarity with Kafka ecosystem (Confluent, Kafka Connect, Kafka Streams).
• Knowledge of SQL, data modeling, and ETL/ELT processes.
• Understanding of cloud platforms (AWS, Azure, Google Cloud Platform) is a plus.
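Again as a non-authoritative sketch of the Snowflake loading and SQL/ELT bullets above: the stage @refdata_stage and both table names below are assumptions made for illustration only.

```python
# Hedged sketch: Snowflake bulk load plus a set-based ELT merge step.
# Stage, file-format, and table names are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_USER", password="...", account="my_account",  # placeholders
    warehouse="ETL_WH", database="REFDATA", schema="PUBLIC",
)
cur = conn.cursor()

# Bulk-load staged files in parallel; far faster than row INSERTs.
cur.execute("""
    COPY INTO instruments_raw
    FROM @refdata_stage/instruments/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# ELT: transform inside the warehouse with a single MERGE.
cur.execute("""
    MERGE INTO instruments AS t
    USING instruments_raw AS s
    ON t.isin = s.isin
    WHEN MATCHED THEN UPDATE SET t.name = s.name, t.asset_class = s.asset_class
    WHEN NOT MATCHED THEN INSERT (isin, name, asset_class)
        VALUES (s.isin, s.name, s.asset_class)
""")
conn.close()
```

Doing the transform inside Snowflake (ELT) rather than before loading keeps the heavy set-based work on the warehouse, which is the usual reason postings pair "ETL/ELT" with Snowflake tuning experience.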
Domain knowledge in any of the below areas:
• Trade Processing, Settlement, Reconciliation, and related back/middle-office functions within financial markets (Equities, Fixed Income, Derivatives, FX, etc.).
• Strong understanding of trade lifecycle events, order types, allocation rules, and settlement processes.
• Funding Support, Planning & Analysis, Regulatory Reporting & Compliance.
• Knowledge of regulatory standards (such as Dodd-Frank, EMIR, MiFID II) related to trade reporting and lifecycle management.






