KDB+/Q Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a KDB+/Q Engineer, onsite in NYC, on a 4+ month contract at a pay rate of $85-$95/hour. It requires 10+ years of data engineering experience, including 7+ years in KDB+/q, and strong knowledge of market data systems.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
760
-
πŸ—“οΈ - Date discovered
September 26, 2025
-
πŸ•’ - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
πŸ“„ - Contract type
W2 Contractor
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
New York, NY
-
🧠 - Skills detailed
#Deployment #Visualization #Observability #Data Quality #Grafana #S3 (Amazon Simple Storage Service) #Batch #DevOps #Compliance #.Net #Scripting #BI (Business Intelligence) #REST (Representational State Transfer) #Cloud #Monitoring #Tableau #Python #C++ #kdb+ #Migration #Code Reviews #Linux #Data Pipeline #Kafka (Apache Kafka) #Strategy #Programming #Java #Documentation #Shell Scripting #Data Engineering #Storage #Logging
Role description
Job Title: KDB+/Q Engineer (Contract)
Company: Big Four Client
Location: New York, NY (onsite)
Work Authorization: U.S. Citizen or Green Card Holder
Pay Rate: $85-$95 per hour (W2)
Duration: 4+ Months

Overview:
We are seeking an experienced KDB+/Q engineer to design, build, and optimize low-latency time-series data pipelines and analytics used by trading, quant, and risk teams. This is an onsite role in NYC for one of our Big Four clients, working closely with quants, traders, and platform engineering to deliver high-performance market data and analytics solutions.

Key Responsibilities:
• Own the end-to-end KDB+ architecture: tickerplant/rdb/hdb design, schema evolution, and storage strategy (splayed/partitioned tables, attributes, compression).
• Build and optimize real-time and historical data pipelines for market data and trade/order events; implement robust feed handlers and entitlements.
• Develop high-performance q code and APIs for analytics, research, and production use cases (asof joins, windowed aggregations, order book analytics; see the first q sketch after this posting).
• Integrate KDB+ with surrounding systems (e.g., Python/PyKX, Java/C++ gateways, Kafka, REST/WebSocket services, files/FTP, cloud object storage).
• Implement monitoring, alerting, and capacity planning; tune performance across memory, disk, and IPC; troubleshoot latency and data-quality issues.
• Establish SDLC best practices for q: code reviews, unit/integration tests, CI/CD, versioning, and production release management.
• Collaborate with Market Data, Application Support, and DevOps/SRE on production readiness, incident response, and runbooks.
• Deliver clear documentation, handover materials, and knowledge transfer by contract end.

Required Skills & Qualifications:
• 10+ years of professional data engineering experience, with deep expertise in time-series and columnar data.
• 7+ years of hands-on KDB+/q experience in production environments supporting trading, market data, or risk platforms.
• Expert-level q: idiomatic vector programming; joins (aj/aj0/uj/lj), windowed operations, keyed/splayed/partitioned tables, enumerations, attributes (p/s/u), upsert patterns.
• Strong knowledge of kdb+tick components (tickerplant, rdb, hdb), sym management, EOD processes, and schema/version migration (see the second sketch after this posting).
• Proven low-latency optimization skills: memory/disk layout, IPC, batching, compression, partitioning strategy, and query tuning.
• Solid Linux engineering background, including shell scripting, networking fundamentals, and performance profiling.
• Practical experience with market data (e.g., ITCH/OUCH, FIX/FAST, proprietary exchange feeds), order book modeling, and tick-level analytics.
• Production discipline: observability (logging/metrics/tracing), incident management, and change control in regulated environments.
• Excellent communication and stakeholder management across trading, quant, and infrastructure teams; ability to operate autonomously on tight timelines.

Preferred Skills & Qualifications:
• Kafka or other pub/sub experience; schema registry and exactly-once or idempotent patterns.
• Python (PyKX), Java/C++/.NET gateways; building REST/WebSocket services for kdb+.
• Experience with KX Insights, cloud object storage (S3/GCS), or hybrid/on-prem deployments.
• BI/visualization integration (KX Dashboards, Grafana, Tableau) and entitlements/compliance for vendor/exchange data (Bloomberg, Refinitiv, direct feeds).
• Experience with backtesting/research platforms and data quality frameworks.
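
For candidates gauging the q fluency these responsibilities imply, here is a minimal sketch of the asof-join and windowed-aggregation work named above. All table names, schemas, and values are hypothetical, not taken from the client's systems.

/ illustrative trade and quote tables (hypothetical data)
trade:([] time:09:30:00.000 09:30:00.500 09:30:01.250; sym:`AAPL`AAPL`MSFT; price:189.10 189.12 415.30; size:100 250 50)
quote:([] time:09:29:59.900 09:30:00.400 09:30:01.000; sym:`AAPL`AAPL`MSFT; bid:189.05 189.11 415.25; ask:189.15 189.13 415.35)
/ asof join: attach the prevailing quote (latest at or before each trade time, per sym)
aj[`sym`time;trade;quote]
/ windowed aggregation: volume-weighted average price in 5-second buckets per sym
select vwap:size wavg price by sym,bucket:5 xbar time.second from trade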
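
Likewise, a minimal sketch of the splayed/partitioned storage, attribute, and sym-management work the requirements reference, reusing the hypothetical trade table above; the HDB path and partition date are made up for illustration.

/ sort by sym and apply the parted attribute expected on partitioned tables
t:update `p#sym from `sym xasc trade
/ enumerate symbol columns against the HDB sym file and splay into a date partition
`:/data/hdb/2025.09.26/trade/ set .Q.en[`:/data/hdb;t]

In production this write-down would typically run as part of an EOD process driven from the rdb.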