Data Engineer (KDB+ / Q)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer (KDB+ / Q) contract position in New York, NY, lasting until 12/31/2025, paying $90-$100 per hour. Requires 5+ years of experience, strong KDB+ and Q skills, and financial services industry expertise.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
800
🗓️ - Date discovered
September 27, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
New York, NY
🧠 - Skills detailed
#Data Ingestion #Consulting #Programming #Spark (Apache Spark) #GCP (Google Cloud Platform) #Python #Azure #Databases #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Data Engineering #Kafka (Apache Kafka) #Data Analysis #Cloud #Scala #Data Processing #Data Pipeline
Role description
Job Title: Data Engineer (KDB+ / Q)
Job Type: Contract (W2)
Contract Duration: ASAP through 12/31/2025 (with potential for extension)
Work Schedule: Monday-Friday, 8 hours per day, 40 hours per week (standard business hours)
Location: New York, NY (onsite at the end client's office 4 days per week, with potential for 5 days)
Compensation: $90 to $100 per hour

Overview:
A Big Four consulting firm is seeking an experienced Data Engineer with deep expertise in KDB+ and Q to support a high-performance financial services engagement. The role calls for hands-on experience designing, implementing, and optimizing KDB+ time-series databases and developing in Q to enable ultra-low-latency data processing. The successful candidate will work directly with client stakeholders in a front-office environment, supporting data-driven initiatives that demand millisecond-level efficiency.

Responsibilities:
• Design, develop, and maintain KDB+ time-series databases to support real-time and historical data analysis.
• Write and optimize Q scripts for data ingestion, transformation, querying, and analytics (an illustrative sketch follows this description).
• Collaborate with quantitative analysts, traders, and technology teams to implement data solutions that enable high-performance trading and risk management.
• Ensure database scalability, stability, and performance tuning for ultra-low-latency applications.
• Build robust data pipelines for ingesting large volumes of structured and unstructured market and transactional data.
• Troubleshoot production issues and provide ongoing support for mission-critical systems.
• Document system designs, processes, and technical specifications.

Required Qualifications:
• High school diploma (or GED/equivalent).
• 5+ years of professional experience as a Data Engineer or in a similar role.
• Strong hands-on expertise in KDB+ and the Q programming language.
• Proven experience with time-series databases in trading, quantitative research, or financial services.
• Solid understanding of market data, order flow, and real-time financial systems.
• Ability to work in a fast-paced environment with tight deadlines.
• Excellent communication and collaboration skills, with prior experience working onsite with clients.

Preferred Qualifications:
• Bachelor's or advanced degree.
• Experience with complementary data engineering tools (e.g., Python, Spark, Kafka).
• Knowledge of cloud platforms (AWS, GCP, or Azure) for large-scale data processing.
• Background in quantitative finance or algorithmic trading support.
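
To give a flavor of the day-to-day Q work listed under Responsibilities, here is a minimal, hypothetical q sketch. The table name, columns, and values are invented for illustration and are not drawn from the client environment.

```q
/ Hypothetical in-memory trade table (name, columns, and values are illustrative only)
trade:([] time:2025.09.26D09:30:00 2025.09.26D09:30:01 2025.09.26D09:30:02;
  sym:`AAPL`MSFT`AAPL;
  price:189.25 402.10 189.40;
  size:100 250 150)

/ q-SQL aggregation: last traded price and total volume per symbol
select last price, sum size by sym from trade

/ Time-window filter of the kind used in real-time and historical queries
select from trade where time within 2025.09.26D09:30:00 2025.09.26D09:30:01
```

The sketch only shows basic q-SQL idioms the role assumes fluency with; the engagement itself involves production-grade pipelines and much larger real-time and historical data volumes.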