Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/pinax-network/antelope-transactions-api
Last synced: about 2 months ago
- Host: GitHub
- URL: https://github.com/pinax-network/antelope-transactions-api
- Owner: pinax-network
- License: MIT
- Created: 2024-08-28T17:09:39.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2024-08-29T17:19:27.000Z (4 months ago)
- Last Synced: 2024-08-30T18:18:47.541Z (4 months ago)
- Language: TypeScript
- Size: 608 KB
- Stars: 0
- Watchers: 5
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
README
# Antelope Transactions API
[![.github/workflows/bun-test.yml](https://github.com/pinax-network/antelope-transactions-api/actions/workflows/bun-test.yml/badge.svg)](https://github.com/pinax-network/antelope-transactions-api/actions/workflows/bun-test.yml)
> Transactions information from the Antelope blockchains, powered by [Substreams](https://substreams.streamingfast.io/)
## Swagger API
### Usage
| Path | Description |
| :------ | ----------- |
| GET `/actions/tx_hash/{tx_hash}` | Actions by transaction |
| GET `/actions/block_number/{block_number}` | Actions by block |
| GET `/actions/block_date/{block_date}` | Actions by date |
| GET `/authorizations/tx_hash/{tx_hash}` | Authorizations by transaction |
| GET `/authorizations/block_number/{block_number}` | Authorizations by block |
| GET `/authorizations/block_date/{block_date}` | Authorizations by date |
| GET `/blocks/hash/{hash}` | Blocks by hash |
| GET `/blocks/number/{number}` | Blocks by number |
| GET `/blocks/date/{date}` | Blocks by date |
| GET `/db_ops/tx_hash/{tx_hash}` | Database operations by transaction |
| GET `/db_ops/block_number/{block_number}` | Database operations by block |
| GET `/db_ops/block_date/{block_date}` | Database operations by date |
| GET `/transactions/hash/{hash}` | Transactions by hash |
| GET `/transactions/block_number/{block_number}` | Transactions by block |
| GET `/transactions/block_date/{block_date}` | Transactions by date |

> [!NOTE]
> All endpoints support `first`, `skip`, `order_by` and `order_direction` as additional query parameters (see the example below).
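For example, assuming the API is running on its default `localhost:8080` listen address, a paginated query could look like this (the block number and parameter values are purely illustrative):

```console
# First 10 actions of a block, newest first
# (hostname, block number and parameter values are illustrative)
curl "http://localhost:8080/actions/block_number/123456789?first=10&skip=0&order_by=block_number&order_direction=desc"
```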
### Docs

| Path | Description |
| :--- | ----------- |
| GET `/openapi` | [OpenAPI](https://www.openapis.org/) specification |
| GET `/version` | API version and Git short commit hash |

### Monitoring

| Path | Description |
| :--- | ----------- |
| GET `/health` | Checks database connection |
| GET `/metrics` | [Prometheus](https://prometheus.io/) metrics |
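To check that a running instance is healthy, the monitoring endpoints can be queried directly. The host and port below assume the API's defaults (`localhost:8080`):

```console
# Database connectivity check
curl http://localhost:8080/health

# Prometheus metrics in text exposition format
curl http://localhost:8080/metrics
```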
### `X-Api-Key`

Use the `Variables` tab at the bottom of the page to add your API key:
Get API key:
```json
{
"X-Api-Key": "PINAX_API_KEY"
}
```
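Outside of the Swagger UI, the key can presumably be passed as an HTTP request header of the same name; the call below is a hypothetical sketch, and `PINAX_API_KEY` is a placeholder for your actual key:

```console
# Hypothetical direct call with the API key sent as a header
curl -H "X-Api-Key: PINAX_API_KEY" "http://localhost:8080/blocks/number/123456789"
```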
### Additional notes

- The response contains pagination and statistics.
## Requirements
- [ClickHouse](https://clickhouse.com/); databases should follow the `{chain}_transactions_{version}` naming scheme (e.g. `eos_transactions_v1`).
- The [`substreams-raw-blocks`](https://github.com/pinax-network/substreams-raw-blocks/) Antelope spkg is used as the data source.
- A [Substreams sink](https://substreams.streamingfast.io/reference-and-specs/glossary#sink) for loading data into ClickHouse. We recommend [Substreams Sink SQL](https://github.com/pinax-network/substreams-sink-sql/).

### API stack architecture

![API architecture diagram](api_architecture_diagram.png)
### Setting up the database backend (ClickHouse)
#### Without a cluster
Example of how to set up the ClickHouse backend for sinking [EOS](https://pinax.network/en/chain/eos) data.
1. Start the ClickHouse server
```console
clickhouse server
```

2. Create the transactions database
```console
echo "CREATE DATABASE eos_transactions_v1" | clickhouse client -h --port 9000 -d -u --password
```

3. Run the [`create_schema.sh`](./create_schema.sh) script
```console
./create_schema.sh -o /tmp/schema.sql
```

4. Execute the schema
```console
cat /tmp/schema.sql | clickhouse client -h <host> --port 9000 -d <database> -u <user> --password <password>
```

5. Run the [sink](https://github.com/pinax-network/substreams-sink-sql)
```console
substreams-sink-sql run clickhouse://<user>:<password>@<host>:9000/eos_transactions_v1 \
https://github.com/pinax-network/substreams-raw-blocks/releases/download/antelope-v0.3.0/raw-blocks-antelope-v0.3.0.spkg `#Substreams package` \
-e eos.substreams.pinax.network:443 `#Substreams endpoint` \
1: `#Block range <start>:<stop>` \
--final-blocks-only --undo-buffer-size 1 --on-module-hash-mistmatch=warn --batch-block-flush-interval 100 --development-mode `#Additional flags`
```

6. Start the API
```console
# Will be available on localhost:8080 by default
antelope-transactions-api --host <host> --database eos_transactions_v1 --username <user> --password <password> --verbose
```

#### With a cluster
If you run ClickHouse in a [cluster](https://clickhouse.com/docs/en/architecture/cluster-deployment), change steps 2 and 3:
2. Create the transactions database
```console
echo "CREATE DATABASE eos_transactions_v1 ON CLUSTER " | clickhouse client -h --port 9000 -d -u --password
```

3. ~~Run the [`create_schema.sh`](./create_schema.sh) script~~ (the SQL sink should handle creating the schema)
```console
./create_schema.sh -o /tmp/schema.sql -c <cluster>
```

Steps 4 to 6 are the same as without a cluster.
## [`Bun` Binary Releases](https://github.com/pinax-network/antelope-transactions-api/releases)
> [!WARNING]
> Linux x86 only

```console
$ wget https://github.com/pinax-network/antelope-transactions-api/releases/download/v0.3.4/antelope-transactions-api
$ chmod +x ./antelope-transactions-api
$ ./antelope-transactions-api --help
Usage: antelope-transactions-api [options]

Transactions information from the Antelope blockchains
Options:
-V, --version output the version number
-p, --port HTTP port on which to attach the API (default: "8080", env: PORT)
--hostname Server listen on HTTP hostname (default: "localhost", env: HOSTNAME)
--host Database HTTP hostname (default: "http://localhost:8123", env: HOST)
--database The database to use inside ClickHouse (default: "default", env: DATABASE)
--username Database user (default: "default", env: USERNAME)
--password Password associated with the specified username (default: "", env: PASSWORD)
--max-limit Maximum LIMIT queries (default: 10000, env: MAX_LIMIT)
-v, --verbose Enable verbose logging (choices: "true", "false", default: false, env: VERBOSE)
-h, --help display help for command
```

## `.env` Environment variables

```env
# API Server
PORT=8080
HOSTNAME=localhost

# ClickHouse Database
HOST=http://127.0.0.1:8123
DATABASE=default
USERNAME=default
PASSWORD=
MAX_LIMIT=500

# Logging
VERBOSE=true
```

## Docker environment
- Pull from the GitHub Container Registry
**For latest tagged release**
```bash
docker pull ghcr.io/pinax-network/antelope-transactions-api:latest
```

**For head of `main` branch**
```bash
docker pull ghcr.io/pinax-network/antelope-transactions-api:develop
```

- Build from source
```bash
docker build -t antelope-transactions-api .
```

- Run with `.env` file
```bash
docker run -it --rm --env-file .env ghcr.io/pinax-network/antelope-transactions-api
```

## Contributing
See [`CONTRIBUTING.md`](CONTRIBUTING.md).
### Quick start
Install [Bun](https://bun.sh/)
```console
bun install
bun dev
```

**Tests**
```console
bun lint
bun test
```