Revolution Technologies

Senior Data Architect (KDB+/Q)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect (KDB+/Q) on a 12-month remote contract, requiring expertise in KDB+ and the Q programming language. Key skills include data engineering, experience with cloud environments, and work with large time-series datasets.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 25, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
New York, NY
🧠 - Skills detailed
#Compliance #Programming #Cloud #Datasets #GIT #Data Pipeline #Automation #Scripting #Data Engineering #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Azure #Data Architecture #GCP (Google Cloud Platform) #Process Automation #Data Modeling #Integration Testing #Databases #Time Series #AWS (Amazon Web Services) #Python #Agile
Role description
Position Title: Data Engineer Contractor (KDB+/Q)
Location: Remote
Contract Type: 12-Month Contract

Position Overview:
We are seeking an experienced Data Engineer Contractor to join our project team and provide technical expertise in developing, implementing, and maintaining high-performance data engineering solutions.

Key Responsibilities:
• Design, develop, and maintain systems utilizing KDB+ time-series databases and the Q programming language.
• Collaborate with cross-functional teams to gather requirements and translate business needs into technical solutions.
• Perform system integration, testing, and validation to ensure reliability and optimal performance.
• Conduct data modeling, ETL development, and process automation for time-series and real-time data.
• Troubleshoot, optimize, and resolve technical issues to maintain data system functionality and uptime.
• Document technical specifications, data flows, and procedures to ensure compliance and reusability.

Required Skills & Qualifications:
• Proven experience as a Data Engineer or KDB+ Developer working with large time-series datasets.
• Proficiency in KDB+ and Q, and experience with streaming data pipelines.
• Strong understanding of data architecture, system design, and performance tuning.
• Experience working with cloud environments (AWS, Azure, or GCP) is a plus.
• Excellent problem-solving and communication skills, with the ability to collaborate in a distributed team environment.

Preferred Skills:
• Knowledge of Python, SQL, or other scripting languages.
• Exposure to financial data systems, market data, or real-time analytics.
• Familiarity with CI/CD pipelines, Git, and Agile methodologies.