{"id":13651638,"url":"https://github.com/eosrio/Hyperion-History-API","last_synced_at":"2025-04-22T22:31:45.554Z","repository":{"id":37821473,"uuid":"170685567","full_name":"eosrio/hyperion-history-api","owner":"eosrio","description":"Scalable Full History API Solution for Antelope (formerly EOSIO) based blockchains","archived":false,"fork":false,"pushed_at":"2024-11-08T01:21:25.000Z","size":29230,"stargazers_count":125,"open_issues_count":16,"forks_count":72,"subscribers_count":16,"default_branch":"main","last_synced_at":"2024-11-08T02:24:57.061Z","etag":null,"topics":["antelope","api","big-data","blockchain","elasticsearch","eos","eosio","fio","history","history-api","indexing","monitoring","observability","telos","ultra","wax"],"latest_commit_sha":null,"homepage":"https://hyperion.docs.eosrio.io","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/eosrio.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"license.md","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-02-14T12:09:00.000Z","updated_at":"2024-10-31T23:12:00.000Z","dependencies_parsed_at":"2023-09-24T23:55:17.558Z","dependency_job_id":"dda7a30b-92d7-4aec-a79d-fcaa8f986f1b","html_url":"https://github.com/eosrio/hyperion-history-api","commit_stats":null,"previous_names":[],"tags_count":19,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eosrio%2Fhyperion-history-api","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eosrio%2Fhyperion-history-api/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/rep
ositories/eosrio%2Fhyperion-history-api/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/eosrio%2Fhyperion-history-api/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/eosrio","download_url":"https://codeload.github.com/eosrio/hyperion-history-api/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":223906292,"owners_count":17223045,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["antelope","api","big-data","blockchain","elasticsearch","eos","eosio","fio","history","history-api","indexing","monitoring","observability","telos","ultra","wax"],"created_at":"2024-08-02T02:00:51.126Z","updated_at":"2025-04-22T22:31:45.535Z","avatar_url":"https://github.com/eosrio.png","language":"TypeScript","readme":"\u003c!--suppress HtmlUnknownTarget, HtmlDeprecatedAttribute --\u003e\n\u003cbr\u003e\u003c/br\u003e\n\u003cp align=\"center\"\u003e\n  \u003cpicture\u003e\n    \u003csource media=\"(prefers-color-scheme: dark)\" srcset=\"https://eosrio.io/hyperion-white.png\"\u003e\n    \u003cimg alt=\"Hyperion Logo\"\n         src=\"https://eosrio.io/hyperion.png\"\u003e\n  \u003c/picture\u003e\n\u003c/p\u003e\n\n\n\u003ch4 align=\"center\"\u003e\n    Scalable Full History \u0026 State Solution for \n    \u003ca href=\"https://antelope.io\"\u003e\n        Antelope\n    \u003c/a\u003e\n    based blockchains \u003cbr\u003e\n\u003c/h4\u003e\n\n\u003cbr\u003e\n\n\u003cdiv align=\"center\"\u003e\n\nMade with ♥ by [Rio 
Blocks](https://rioblocks.io/?lang=en)\n\u003c/div\u003e\n\n\u003cdiv align=\"center\"\u003e\n\n![CI](https://github.com/eosrio/hyperion-history-api/actions/workflows/build.yml/badge.svg)\n\u003c/div\u003e\n\n\u003cdiv align=\"center\"\u003e\n\n## 📖 [Hyperion Docs - Official Documentation](https://hyperion.docs.eosrio.io)📖\n\u003c/div\u003e\n\n### How to use:\n\n - [For Infrastructure Providers](https://hyperion.docs.eosrio.io/providers/get-started/)\n\n - [For Developers](https://hyperion.docs.eosrio.io/dev/howtouse/)\n\n - [API Reference](https://hyperion.docs.eosrio.io/api/v2/)\n\n### Official plugins:\n\n- [Hyperion Lightweight Explorer](https://github.com/eosrio/hyperion-explorer-plugin)\n\n## 1. Overview\n\nHyperion is a high-performance, scalable solution designed to index, store, and retrieve the full history and current state of Antelope-based blockchains (formerly EOSIO). Antelope chains can generate vast amounts of data, demanding robust indexing, optimized storage, and efficient querying capabilities. Hyperion addresses these challenges by providing open-source software tailored for block producers, infrastructure providers, and dApp developers.\n\n**Key Features:**\n\n*   **Scalable Indexing:** Designed to handle high-throughput Antelope chains.\n*   **Full History:** Captures and stores every action and state change.\n*   **Optimized Data Structure:** Actions are stored flattened, with inline actions linked via transaction IDs, reducing redundancy (e.g., notifications identical to parent actions are omitted). Full blocks/transactions are reconstructed on demand, saving storage space.\n*   **Current State Indexing:** Optionally stores the latest state of specific contracts/tables in MongoDB for fast lookups.\n*   **Modern API (v2):** Offers comprehensive endpoints for history, state, and statistics. 
Legacy v1 API support is maintained for compatibility.\n*   **Live Streaming:** Provides real-time action and state delta streams via WebSockets.\n*   **Extensible:** Features a plugin system managed by the `hpm` tool.\n\n## 2. Core Concepts\n\nHyperion operates by separating the concerns of historical event streams and current on-chain state:\n\n1.  **Data Ingestion:** The **Indexer** connects to an Antelope node's State History Plugin (SHIP) WebSocket endpoint.\n2.  **Processing \u0026 Queuing:** The Indexer deserializes action traces and state deltas, applies filtering (whitelists/blacklists), enriches data, and pushes processed data onto **RabbitMQ** queues.\n3.  **History Storage:** Indexer worker processes consume data from RabbitMQ and index historical action traces and state deltas into **Elasticsearch**. This forms the backbone for historical queries.\n4.  **State Storage:** If configured, Indexer workers (or dedicated sync tools) process deltas or perform full scans to maintain the *current state* of specified accounts, proposals, voters, or contract tables within **MongoDB**.\n5.  **Data Serving:** The **API Server** handles client requests. It queries:\n    *   **Elasticsearch** for historical data (`/v2/history/*`, `/v1/*`).\n    *   **MongoDB** for current state data (`/v2/state/*`).\n    *   **Redis** for cached responses and transaction lookups.\n    *   The **Antelope Node** directly for real-time chain info or as a fallback.\n\n\n## 3. Architecture\n\nA typical Hyperion deployment involves the following components. While they can run on a single machine for smaller chains or development, production environments benefit from distributing them across multiple servers connected via a high-speed network.\n\n### 3.1 Antelope Node (SHIP Enabled)\nThe source of blockchain data. 
A node (e.g., built from the [AntelopeIO/leap](https://github.com/AntelopeIO/leap) repository) running the `state_history_plugin` provides action traces and state deltas via a WebSocket connection to the Hyperion Indexer.\n\n### 3.2 RabbitMQ\nA robust message queuing system. Used as a buffer and transport layer between the different stages of the Hyperion Indexer (Reader -\u003e Deserializer -\u003e Indexer Workers) and for routing real-time data streams to connected API clients.\n\n### 3.3 Redis\nAn in-memory data store used for:\n*   **API Response Caching:** Temporarily storing results of frequent API queries.\n*   **Preemptive Transaction Caching:** Storing recent transaction details for fast lookups via `v2/history/get_transaction` and `check_transaction`.\n*   **API Usage Statistics:** Tracking API endpoint usage rates.\n*   **Inter-process Communication:** Facilitating coordination, e.g., for rate limiting across clustered API instances (via `@fastify/rate-limit`).\n*   **Live Streaming Coordination:** Used by the Socket.IO Redis adapter for managing stream subscriptions across clustered API instances.\n\n### 3.4 Elasticsearch Cluster\n\nThe primary datastore for **indexed historical data**. It stores processed action traces, state deltas, and block headers.\n*   **Role:** Enables powerful search and aggregation capabilities for historical queries (e.g., `get_actions`, `get_deltas`).\n*   **Requirement:** Essential for all Hyperion history functionalities.\n*   **Recommendation:** Requires significant RAM (32GB+ per node recommended), CPU, and fast storage (SSD/NVMe recommended for ingest nodes, HDDs can be used for cold storage nodes). 
Multi-node clusters are highly recommended for production.\n\n### 3.5 MongoDB\n\nThis MongoDB integration complements Elasticsearch by focusing on **current state data** rather than historical actions, enabling efficient state queries without scanning history.\n*   **Recommendation:** Requires adequate RAM, CPU, and Disk I/O, particularly if indexing large amounts of contract state.\n\n**System Contract State Storage:**\n- Stores searchable state data for Antelope system contracts like token balances, proposals, and voter information\n- Maintains three primary collections by default:\n    - `accounts`: Stores token balances with indexes for code, scope, and symbol\n    - `proposals`: Tracks governance proposals with detailed approval status\n    - `voters`: Manages staking and voting records with optimized query paths\n\n**Custom Contract State Tracking:**\n\n- Supports operator-defined custom contracts and tables\n- Uses a flexible configuration system to define which contract tables to synchronize\n- Automatically creates appropriate indexes based on contract schemas\n- Stores tables in collections named `{contract}-{table}`\n\n\n**State Synchronization:**\n\n- Enables state synchronization even when starting from snapshots, providing a complete view of the blockchain state\n- Managed through the `hyp-control` CLI tool, allowing for targeted synchronization of specific contracts\n- Maintains block references to track state changes over time\n\n\n**Query Optimization:**\n\n- Creates specialized indexes based on common query patterns\n- Supports advanced query capabilities including MongoDB operators like `$gt`, `$lt`, `$in` for filters\n- Automatically handles date fields for time-based queries\n\n\n**API Integration:**\n\n- Provides dedicated API endpoints for querying state data\n- Supports endpoints like `/v2/state/*` API endpoints\n- Offers flexible filtering options with pagination\n\n\n**Dynamic Contract Schema Support:**\n\n- Either automatically creates 
indexes based on contract ABIs\n- Or allows for manual index configuration for custom query patterns\n- Supports text search indexes for specific fields when configured\n\n\n### 3.6 Hyperion Indexer\n\nA Node.js application responsible for fetching data from SHIP, deserializing it, processing actions and deltas according to configured filters/handlers, and publishing data to RabbitMQ queues for indexing and state updates. Managed by the [PM2](https://pm2.keymetrics.io/) process manager.\n\n### 3.7 Hyperion API Server\nA Node.js (Fastify framework) application that serves the HTTP API endpoints (v1 and v2). It queries Elasticsearch, MongoDB, Redis, and the Antelope node as needed. It also manages the Swagger documentation UI and handles WebSocket connections for live streaming. Typically run in cluster mode using PM2 for scalability and resilience.\n\n### 3.8 Hyperion Stream Client (Optional)\nA client library (for Web and Node.js) simplifying connection to the real-time streaming endpoints offered by enabled Hyperion providers. See [Stream Client Documentation](https://hyperion.docs.eosrio.io/dev/stream_client/).\n\n### 3.9 Hyperion Plugins (Optional)\nHyperion features an extensible plugin architecture. Plugins can add custom data handlers, API routes, or other functionalities. Managed via the `hpm` command-line tool.\n*   **Example:** [Hyperion Lightweight Explorer](https://github.com/eosrio/hyperion-explorer-plugin)\n\n## 4. Getting Started\n\nFor detailed setup instructions, API usage, and technical deep-dives, please visit the **[Official Hyperion Documentation](https://hyperion.docs.eosrio.io)**.\n\n## 5. 
API Usage\n\n*   Hyperion exposes a comprehensive **v2 API** for querying history and state.\n*   A **v1 API** (compatible with the legacy `history_plugin`) is also provided.\n*   Interactive API documentation is available via **Swagger UI** at the `/docs` endpoint of your running API server (e.g., `http://your-hyperion-ip:7000/docs`).\n*   Refer to the [API Reference Documentation](https://hyperion.docs.eosrio.io/api/v2/) for details and examples.\n\n## 6. Contributing\n\nWe appreciate community contributions to Hyperion! Here’s how you can help:\n\n*   **Report Bugs:** Find a problem? Please open an [Issue](https://github.com/eosrio/hyperion-history-api/issues) detailing the steps to reproduce it.\n*   **Suggest Enhancements:** Have an idea? Open an Issue or discuss it on our [Hyperion Telegram group](https://t.me/EOSHyperion) first.\n*   **Submit Code:** Pull Requests (PRs) are welcome for bug fixes and improvements. For larger features, please discuss them in an issue or on the [Telegram group](https://t.me/EOSHyperion) beforehand.\n\n## 7. License\n\nHyperion History API is licensed under the [Attribution-NonCommercial-ShareAlike 4.0 International](https://github.com/eosrio/hyperion-history-api/blob/main/license.md).","funding_links":[],"categories":["Developers"],"sub_categories":["Libraries and Frameworks"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Feosrio%2FHyperion-History-API","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Feosrio%2FHyperion-History-API","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Feosrio%2FHyperion-History-API/lists"}