Professional-grade trading terminal built from the ground up to highlight potentially interesting market situations, deliver trading opportunities and signals, and trade financial instruments across a wide range of markets. It fuses high-performance, real-time data visualization in a web app with a robust, event-driven back end to give traders a critical edge.
This repository contains the full codebase, from the Next.js frontend to the Python-based data ingestion and trading services. The project is architected for future integration of AI-driven analysis and trade automation, aiming to evolve from a powerful manual trading tool into a comprehensive, intelligent automated trading system.
- 📢 Real-Time Market Insights: Back-end scanners produce market insights (situations, opportunities, signals) in real time and stream them to the user.
- 📈 Multi-Chart Visualisation: Synchronized charts with key technical indicators (RSI, Volume, VWMA).
- ⚡ High-Resolution Market Depth: A CScalp-inspired UI with a real-time order book (DOM), live ticks, and dynamic volume clusters.
- ⌨️ Rapid Order Execution: Place orders directly from the chart or DOM using keyboard shortcuts and mouse clicks, with semi-automated risk management.
- A high-fidelity Market Replay Engine for backtesting manual and algorithmic strategies.
- An AI Mentor to analyze trading patterns and optimize trading strategy.
- Trading signals from an AI Market Analyser.
- Strategy execution by an AI Trader trained on market data and user trades.
Hyper Scalper is built as a modular monolith, following Domain-Driven Design principles and employing a functional-reactive, event-driven architecture. This approach ensures clear separation of concerns, scalability, and efficient real-time data streaming.
Key Principles:
- Asynchronous Communication: Bounded contexts (modules) communicate asynchronously via events over a message broker (Kafka). This decouples services and ensures resilience.
- Immutable Data Streams: Pipelines are designed as compositions of functions that operate on immutable data streams, promoting predictable and testable logic.
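As a minimal sketch of these two principles, the snippet below composes pure functions over an immutable event stream and publishes the result back to the broker. It assumes an aiokafka client and illustrative topic names (`market.ticks`, `market.situations`); the scanners in this repository are structured differently.

```python
import asyncio
import json

from aiokafka import AIOKafkaConsumer, AIOKafkaProducer


def normalize(raw: dict) -> dict:
    # Pure function: returns a new dict, never mutates its input.
    return {**raw, "symbol": raw["symbol"].upper()}


def detect(tick: dict) -> dict | None:
    # Pure function: emits a situation payload, or nothing.
    if tick.get("volume", 0) > 1_000_000:
        return {"kind": "volume_spike", **tick}
    return None


async def run_scanner() -> None:
    consumer = AIOKafkaConsumer("market.ticks", bootstrap_servers="localhost:9092")
    producer = AIOKafkaProducer(bootstrap_servers="localhost:9092")
    await consumer.start()
    await producer.start()
    try:
        async for msg in consumer:
            # Composition of pure functions over the immutable event stream.
            situation = detect(normalize(json.loads(msg.value)))
            if situation is not None:
                await producer.send_and_wait(
                    "market.situations", json.dumps(situation).encode()
                )
    finally:
        await consumer.stop()
        await producer.stop()


if __name__ == "__main__":
    asyncio.run(run_scanner())
```

Because every stage only consumes and produces immutable messages over Kafka, each context can be tested in isolation and restarted without affecting the others.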
The backend is structured into distinct modules, each representing a bounded context:
- Ingestion Context: Connects to data sources (e.g., Telegram, exchange WebSockets) and normalizes raw inputs into canonical event types.
- Detection Context: Processes market data streams to detect and emit `MarketSituation` events.
  - Example: “Volume spike on TUTUSDT on binance (usdt_perps): 6.31M in 14m, anomaly 10%.”
- Enrichment Context: Subscribes to `MarketSituation` events and enhances them with external context like news or symbol metadata.
- Opportunity Context: Consumes enriched situations and news events to build and emit `TradingOpportunity` events.
  - Example: “Gap-up continuation possible on TUTUSDT 5m; momentum aligned; watch pullback to VWAP.”
- Signal Context: Consumes `TradingOpportunity` events to generate and emit actionable `TradingSignal` events.
  - Example: “Long TUTUSDT @ 0.0701; SL 0.0689; TP1 0.0716; TP2 0.0730.”
- Delivery Context: Subscribes to all `MarketInsight` events (situations, opportunities, signals) to stream them to the UI via WebSockets and persist them to storage.
- Orchestration Context: Manages and schedules the end-to-end data pipelines (scanners).
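To make the event chain concrete, here is a hedged sketch of what these canonical types could look like as Pydantic models. Field names and nesting are assumptions for illustration, not the repository's actual schemas.

```python
from datetime import datetime
from typing import Literal

from pydantic import BaseModel


class MarketSituation(BaseModel):
    symbol: str
    exchange: str
    market: str            # e.g. "usdt_perps"
    description: str       # "Volume spike on TUTUSDT ..."
    detected_at: datetime


class TradingOpportunity(BaseModel):
    situation: MarketSituation   # the enriched situation it was built from
    timeframe: str               # e.g. "5m"
    thesis: str                  # "Gap-up continuation possible ..."


class TradingSignal(BaseModel):
    opportunity: TradingOpportunity
    side: Literal["long", "short"]
    entry: float
    stop_loss: float
    take_profits: list[float]    # e.g. [0.0716, 0.0730]


# The Delivery Context can treat any of these as a MarketInsight.
MarketInsight = MarketSituation | TradingOpportunity | TradingSignal
```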
All market data, user trading history, and configuration are persistently stored in a TimescaleDB instance, optimized for time-series data.
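As a sketch, one way a tick table might be provisioned as a TimescaleDB hypertable, using psycopg here; the schema, table name, and connection string are assumptions.

```python
import psycopg

CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS ticks (
    time   TIMESTAMPTZ       NOT NULL,
    symbol TEXT              NOT NULL,
    price  DOUBLE PRECISION  NOT NULL,
    volume DOUBLE PRECISION  NOT NULL
)
"""

with psycopg.connect("postgresql://localhost:5432/hyper_scalper") as conn:
    conn.execute(CREATE_TABLE)
    # Partition by time so TimescaleDB can optimize time-range queries.
    conn.execute("SELECT create_hypertable('ticks', 'time', if_not_exists => TRUE)")
```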
RESTful endpoints provide access to resources like market data, orders, and positions. Real-time market insights (situations, opportunities, signals) are delivered to the frontend via WebSockets.
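An illustrative FastAPI sketch of that delivery surface; the route paths and payloads are assumptions rather than the project's actual API.

```python
from fastapi import FastAPI, WebSocket

app = FastAPI()


@app.get("/api/positions")
async def list_positions() -> list[dict]:
    # RESTful access to resources such as market data, orders, and positions.
    return []


@app.websocket("/ws/insights")
async def stream_insights(websocket: WebSocket) -> None:
    # Real-time channel for market insights (situations, opportunities, signals).
    await websocket.accept()
    # In the real service this would relay events consumed from Kafka.
    await websocket.send_json({
        "type": "market_situation",
        "description": "Volume spike on TUTUSDT on binance (usdt_perps)",
    })
    await websocket.close()
```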
Developed with Next.js and TypeScript, the web application provides a rich, interactive user interface. It features synchronized multi-chart layouts, a high-resolution market depth (DOM) with live trades and volume clusters, and a real-time signal feed. It is designed for rapid order execution and visual management of trades.
- Web App: TypeScript 5, Next.js, Tailwind CSS, TradingView Lightweight Charts
- Back End: Python 3.13, FastAPI, Pydantic, Telethon, CCXT, Kafka, TimescaleDB