

Sharp Decisions
Azure Data Engineer (983)
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer (983) on a contract basis in New York, NY. Key skills include Databricks, Kafka, Python, and Azure Data Factory. Requires experience in Fixed Income products and Market Liquidity data flows. W2 candidates only.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
May 2, 2026
Duration
Unknown
-
Location
Hybrid
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
New York, NY
-
Skills detailed
#Storage #Scala #Azure Data Factory #Data Engineering #Kafka (Apache Kafka) #Python #Data Pipeline #Microsoft Power BI #Data Management #ADF (Azure Data Factory) #SQL (Structured Query Language) #BI (Business Intelligence) #Azure Blob Storage #Azure #Databricks #Vault #Data Analysis #Data Processing #Data Architecture
Role description
Contract · Possible to Hire
Hybrid · New York, NY
W2 Candidates Only – No C2C, No 3rd-Party Agencies Please
Business Domain Knowledge
Fixed Income & Capital Markets
β’ Understanding of Fixed Income products, specifically US Treasuries and Futures data.
β’ Familiarity with Market Liquidity Order data flows and how they move through trading systems.
β’ Ability to collaborate across engineering, design, and vendor teams while aligning diverse stakeholder perspectives toward a unified vision.
Technical Skills
Required
Databricks, Delta Share, Kafka, Python, Autosys / JIL, Azure Blob Storage, Azure Key Vault, Azure Data Factory, SQL, Data Analysis, Report Writing
Nice to Have
Redline Trading Solutions, ION, FIX Protocol, SFTP / File Transfer, Power BI
Domain & Soft Skills
US Treasuries, Futures Data, Market Liquidity Order Data Flows, Stakeholder Collaboration, Cross-Team Alignment
Key Responsibilities
• Build and maintain data pipelines using Databricks, Delta Share, Kafka, and Azure Data Factory.
• Develop and support Python-based data processing workflows and Autosys/JIL job scheduling.
• Work with Azure Blob Storage and Key Vault for secure, scalable data management.
• Perform data analysis and SQL-based querying to support Fixed Income business reporting needs.
• Collaborate with engineers, designers, and vendors to align on data architecture and delivery priorities.
• Develop reports and data products supporting US Treasuries, Futures, and Market Liquidity order data flows.