

OnlyGenius
Build MT4/MT5 Data Infrastructure + Real-Time Metrics & TCA System (MetaAPI-Level)
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Backend/Data Infrastructure Engineer with trading-systems experience, focused on building a scalable MT4/MT5 data infrastructure. Contract length and pay rate are unspecified. Key skills include Python, Node.js, and database proficiency.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
October 7, 2025
Duration
Unknown
Location
Unknown
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#Storage #AWS (Amazon Web Services) #Monitoring #REST (Representational State Transfer) #React #API (Application Programming Interface) #Data Extraction #Kubernetes #Documentation #C# #Load Balancing #Data Pipeline #Cloud #Deployment #Logging #PostgreSQL #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Docker #Databases #Azure #Data Engineering #Scala #Python #Web API
Role description
DESCRIPTION
We are OnlyGenius, a quantitative trading technology company currently operating across MT4 and MT5 environments.
We are looking for a Senior Backend / Data Infrastructure Engineer with a proven background in trading systems or broker environments to help us build our own professional-grade infrastructure for real-time data extraction and analysis.
Our goal is to replace MetaAPI with an in-house, scalable solution that centralizes all account metrics, supports thousands of MT4/MT5 accounts, and integrates advanced Transaction Cost Analysis (TCA) capabilities in later phases.
Context
At present, we use MetaAPI to extract trading metrics from MT4/MT5 accounts and feed them into our internal dashboards: one for clients (User Dashboard) and one for the OnlyGenius team (Admin Dashboard).
This approach works but is expensive, limited, and dependent on an external provider.
We now want to build our own fully independent infrastructure, allowing real-time synchronization of account data (balance, equity, open trades, history, commissions, swaps, PnL, and more), stored in a central database and displayed live on both dashboards.
The system must be centralized, professional, and cloud-based, not a simple local script or plugin per client; it must support growth to hundreds or thousands of accounts efficiently.
Core Objectives
Build a MetaAPI-like infrastructure
Connect directly to MT4/MT5 servers through the Manager API or Web API to extract real-time account data.
Handle large-scale multi-account connections with stable and low-latency streaming.
Ensure all metrics update live (balance, equity, trades, exposure, risk, floating PnL).
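The live metrics above can be modeled with a small in-memory schema. The following is a minimal sketch, not the required design: the class and field names are invented for illustration, the standard 100k contract size is an assumption that varies per symbol, and real Manager/Web API payloads will differ.

```python
from dataclasses import dataclass, field

@dataclass
class Position:
    symbol: str
    volume_lots: float          # signed: positive = long, negative = short
    open_price: float
    market_price: float
    contract_size: float = 100_000.0  # standard FX lot; assumption, varies per symbol

    @property
    def floating_pnl(self) -> float:
        # PnL in quote currency: price move * signed volume * contract size
        return (self.market_price - self.open_price) * self.volume_lots * self.contract_size

@dataclass
class AccountSnapshot:
    login: int
    balance: float
    positions: list[Position] = field(default_factory=list)

    @property
    def equity(self) -> float:
        # Equity = balance + floating PnL across all open positions
        return self.balance + sum(p.floating_pnl for p in self.positions)
```

Each streamed tick would update `market_price` on the affected positions, so equity and floating PnL stay current on every read.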
Data Pipeline & Storage
Normalize, structure, and store all incoming trade and account data in a centralized time-series database such as ClickHouse, TimescaleDB, or InfluxDB.
Optimize for fast read/write operations and analytical queries.
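The normalization step might look like the sketch below, which flattens one raw deal record into a row for the time-series store. The raw field names (`ticket`, `time_msc`, `type`, etc.) mirror MT5-style deal records but are assumptions here, as is the type-code convention.

```python
from datetime import datetime, timezone

def normalize_deal(raw: dict, account_login: int) -> dict:
    """Map a raw MT5-style deal record onto one flat row for the time-series store."""
    return {
        "account": account_login,
        "ticket": int(raw["ticket"]),
        # MT5 commonly reports millisecond epochs; store as timezone-aware UTC
        "ts": datetime.fromtimestamp(raw["time_msc"] / 1000, tz=timezone.utc),
        "symbol": raw["symbol"],
        "side": "buy" if raw["type"] == 0 else "sell",  # assumed type coding
        "volume_lots": float(raw["volume"]),
        "price": float(raw["price"]),
        "commission": float(raw.get("commission", 0.0)),
        "swap": float(raw.get("swap", 0.0)),
        "profit": float(raw.get("profit", 0.0)),
    }
```

Keeping every row flat and timestamped in UTC is what makes the downstream analytical queries (per-account PnL curves, trade-history filters) cheap in ClickHouse or TimescaleDB.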
API Layer Development
Create a robust REST and WebSocket API layer to deliver real-time and historical data to our dashboards.
Must support scalable authentication, roles, and granular access control for both Admin and User views.
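One way to structure the real-time side is an in-process fan-out hub behind the WebSocket layer, with role-based filtering applied before publish. This is a sketch under assumptions: the admin-only field names are invented, and a production system would add backpressure and per-account subscriptions.

```python
import asyncio

ADMIN_ONLY_FIELDS = {"commission", "swap", "internal_tags"}  # hypothetical field names

def filter_for_role(update: dict, role: str) -> dict:
    """Strip admin-only fields from updates sent to user-facing dashboards."""
    if role == "admin":
        return dict(update)
    return {k: v for k, v in update.items() if k not in ADMIN_ONLY_FIELDS}

class MetricsHub:
    """Fan out account updates to every connected dashboard client."""

    def __init__(self) -> None:
        self._subscribers: list[tuple[asyncio.Queue, str]] = []

    def subscribe(self, role: str) -> asyncio.Queue:
        # One queue per WebSocket connection; the handler drains it to the socket.
        q: asyncio.Queue = asyncio.Queue()
        self._subscribers.append((q, role))
        return q

    def publish(self, update: dict) -> None:
        # Filter once per role before enqueueing, so User views never see admin data.
        for q, role in self._subscribers:
            q.put_nowait(filter_for_role(update, role))
```

Applying the role filter at publish time, rather than in the frontend, keeps granular access control server-side for both Admin and User views.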
Dashboard Integration
Ensure seamless integration with our existing internal dashboards (built in React/Next.js).
Provide real-time visual updates as balances, trades, and profits change.
Allow for filters, trade history analysis, and account comparison in real time.
Deployment & Scalability
Deploy in the cloud (AWS, Azure, or GCP) using containerized services (Docker, Kubernetes).
Ensure redundancy, load balancing, and horizontal scalability for thousands of accounts.
Provide logging, error handling, and uptime monitoring.
Cost Efficiency & Documentation
Conduct a detailed cost analysis comparing current MetaAPI expenses versus internal infrastructure maintenance and hosting.
Document the full architecture, deployment process, and data flow for internal technical teams.
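The core of the cost analysis is simple arithmetic: per-account provider fees versus a fixed infrastructure cost plus a small per-account marginal cost. The sketch below shows the shape of that comparison; all prices are placeholders, not actual MetaAPI or cloud quotes.

```python
def monthly_costs(accounts: int,
                  metaapi_per_account: float,
                  infra_fixed: float,
                  infra_per_account: float) -> tuple[float, float]:
    """Return (provider cost, in-house cost) per month for a given account count."""
    provider = accounts * metaapi_per_account
    in_house = infra_fixed + accounts * infra_per_account
    return provider, in_house

def break_even_accounts(metaapi_per_account: float,
                        infra_fixed: float,
                        infra_per_account: float) -> float:
    """Account count above which in-house infrastructure is cheaper.

    Solves: accounts * provider_rate = fixed + accounts * marginal_rate.
    """
    return infra_fixed / (metaapi_per_account - infra_per_account)
```

With placeholder inputs, `monthly_costs(1000, 10.0, 2000.0, 1.0)` returns `(10000.0, 3000.0)`; the real report would substitute actual invoices and hosting quotes.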
Phase 2: Transaction Cost Analysis (TCA)
After building the main infrastructure, extend the system to calculate TCA metrics:
Slippage analysis (expected vs actual fill prices).
VWAP / TWAP comparisons.
Markouts (performance post-trade).
Spread impact and latency distributions.
Toxic flow / execution quality analytics.
Output of this data will feed into our internal performance dashboards to audit fill quality and execution performance.
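Several of the Phase 2 metrics above reduce to short formulas. The sketch below illustrates slippage, VWAP, and markouts under assumed conventions (positive slippage = worse fill than expected; positive markout = post-trade move in the trade's favor; point size defaults to a 4-decimal FX pip).

```python
def slippage_points(expected: float, fill: float, side: str,
                    point: float = 0.0001) -> float:
    """Signed slippage in points; positive means the fill was worse than expected."""
    diff = fill - expected
    return (diff if side == "buy" else -diff) / point

def vwap(fills: list[tuple[float, float]]) -> float:
    """Volume-weighted average price over (price, volume) fills."""
    total_volume = sum(v for _, v in fills)
    return sum(p * v for p, v in fills) / total_volume

def markout_points(fill: float, ref_later: float, side: str,
                   point: float = 0.0001) -> float:
    """Post-trade markout in points; positive = market moved in the trade's favor."""
    diff = ref_later - fill
    return (diff if side == "buy" else -diff) / point
```

Computing these per fill and bucketing by symbol, broker, and time of day is what turns the raw pipeline into the execution-quality audit described above.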
Who We're Looking For
A professional with a deep understanding of trading technology and data systems, ideally someone who has built or maintained data infrastructure for brokers, liquidity providers, or hedge funds.
Required profile:
Senior Backend or Data Infrastructure Engineer.
Experience with MetaTrader Manager API, MetaTrader Web API, or MetaAPI.
Proven work in brokerage, prop firm, or financial technology environments.
Strong knowledge of Python, Node.js, or C# for backend development.
Expertise in real-time data streaming, API design, and distributed systems.
Database proficiency: PostgreSQL, ClickHouse, TimescaleDB, or InfluxDB.
Familiarity with Docker, Kubernetes, and cloud orchestration (AWS, GCP, or Azure).
Understanding of FIX API, TCA methodologies, and execution analytics is a strong advantage.
Demonstrated ability to build scalable, low-latency systems for financial data.
Deliverables
A fully operational centralized infrastructure (MetaAPI alternative) connecting to MT4/MT5 servers.
A backend data engine that collects and streams live trading metrics for each connected account.
REST + WebSocket API feeding our Admin and User dashboards in real time.
Deployment and scaling documentation with cost efficiency report.
Optional (Phase 2): TCA analytics module integrated with our existing infrastructure.
Technical Targets
Real-time latency under 1 second.
99.9% uptime.
Scalable to 1,000+ accounts without performance degradation.
Fully centralized architecture (no per-account code).
Secure handling of account credentials and API tokens.
Skills Required
MetaTrader Manager / Web API
MetaAPI or similar systems (MT Proxy, FX Blue Enterprise, etc.)
Backend Development (Python / Node.js / C#)
REST / WebSocket API Development
Real-time Data Streaming
High-performance Databases (ClickHouse, TimescaleDB, PostgreSQL)
Docker / Kubernetes / Cloud Infrastructure
TCA & Execution Analytics (VWAP/TWAP, Slippage, Markouts)
Financial Data Engineering
FIX API Knowledge