Bernard Nickels & Associates

Data Engineer (KDB+ / Q)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (KDB+ / Q) on a contract basis through 12/31/2025, based in New York, NY. The pay rate is $90-$105/hour. Requires 5+ years of data engineering experience, expertise in KDB+, Q, and Ansible, and a background in financial services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
840
-
🗓️ - Date
October 17, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Databases #Programming #Data Ingestion #Ansible #Data Engineering #Azure #Scala #Consulting #Kafka (Apache Kafka) #Data Analysis #Shell Scripting #Data Pipeline #Python #GCP (Google Cloud Platform) #Scripting #Cloud #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Data Processing
Role description
Job Title: Data Engineer (KDB+ / Q)
Job Type: Contract (W2)
Contract Duration: ASAP through 12/31/2025 (with potential for extension)
Work Schedule: Monday-Friday, 8 hours per day, 40 hours per week (standard business hours)
Location: New York, NY (onsite at the end-client's office 4 days per week, with potential for 5 days)
Compensation: $90 to $105 per hour

Overview:
A Big Four consulting firm is seeking an experienced Data Engineer with deep expertise in KDB+ and Q to support a high-performance financial services engagement. The role calls for hands-on experience designing, implementing, and optimizing KDB+ time-series databases and developing in Q to enable ultra-low-latency data processing. The successful candidate will work directly with client stakeholders in a front-office environment, supporting data-driven initiatives that demand millisecond-level efficiency.

Role Breakdown:
• Architecture knowledge: 10%
• Operational: 25%
• KDB+ development: 65%

Responsibilities:
• Design, develop, and maintain KDB+ time-series databases to support real-time and historical data analysis.
• Write and optimize Q scripts for data ingestion, transformation, querying, and analytics (a short illustrative q sketch appears after this description).
• Collaborate with quantitative analysts, traders, and technology teams to implement data solutions that enable high-performance trading and risk management.
• Ensure database scalability, stability, and performance tuning for ultra-low-latency applications.
• Build robust data pipelines for ingesting large volumes of structured and unstructured market and transactional data.
• Troubleshoot production issues and provide ongoing support for mission-critical systems.
• Document system designs, processes, and technical specifications.

Required Qualifications:
• High school diploma (or GED/equivalent).
• 5+ years of professional experience as a Data Engineer in a similar capacity.
• Strong hands-on expertise in KDB+ and the Q programming language, including writing code.
• Extensive Ansible configuration and shell scripting experience.
• Proven experience with time-series databases in trading, quantitative research, or financial services.
• Solid understanding of market data, order flow, and real-time financial systems.
• Ability to work in a fast-paced environment with tight deadlines.
• Excellent communication and collaboration skills, with prior experience working onsite with clients.

Preferred Qualifications:
• A bachelor's (or advanced) degree.
• Experience with data engineering frameworks and tools (e.g., Python, Spark, Kafka).
• Knowledge of cloud platforms (AWS, GCP, or Azure) for large-scale data processing.
• Background in quantitative finance or algorithmic trading support.
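To give a flavor of the KDB+/Q work outlined above, here is a minimal, hypothetical q sketch of a time-series trades table and a volume-weighted average price (VWAP) query. The table name, symbols, and values are illustrative assumptions only and do not reflect the client's actual schema or data.

```q
/ Minimal, hypothetical sketch (not client code): a toy trades table and a
/ typical time-series aggregation of the kind this role describes.
trades:([] time:`timestamp$(); sym:`symbol$(); price:`float$(); size:`long$())

/ Bulk-insert a few made-up rows (one list per column)
`trades insert (2025.10.17D09:30:00.000 2025.10.17D09:30:00.250 2025.10.17D09:31:05.000;
  `AAPL`AAPL`MSFT;
  101.25 101.30 412.10;
  100 50 200);

/ Per-symbol, 5-minute volume-weighted average price
select vwap: size wavg price by sym, bucket: 5 xbar time.minute from trades
```

In a production setting, work of this kind typically extends to as-of joins against quote data, on-disk partitioned databases, and tick-architecture components (tickerplant, real-time and historical databases), consistent with the pipeline and performance-tuning responsibilities listed above.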