https://github.com/littletof/deno_bencher
- Host: GitHub
- URL: https://github.com/littletof/deno_bencher
- Owner: littletof
- Created: 2020-06-05T15:38:52.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2020-06-06T20:25:22.000Z (almost 5 years ago)
- Last Synced: 2025-02-05T21:02:21.259Z (3 months ago)
- Language: TypeScript
- Size: 10.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# deno_bencher
Makes it easier to manage your benchmarks, run them in CI, and save the results into your repo.
## Usage
Install it as a script with `deno install`, use it in a CI job like [this one](./.github/workflows/benchmarks.yml), or run it by hand:
```bash
deno run -A --unstable --allow-hrtime https://deno.land/x/gh:littletof:deno_bencher/run_benchmarks.ts --benches=./benchmarks/benchmarks.ts --json=./benchmarks/benchmarks.json --formatted=./benchmarks/benchmarks.txt -s -os -dv -date -metrics
```

This will look for the file `./benchmarks/benchmarks.ts` (from the `--benches` flag), dynamically import it, and call its `prepareBenchmarks()` function if it exists. In it you can register all the benches you want to run.

After that it will execute all the benchmarks and then write the results into `./benchmarks/benchmarks.json` (from the `--json` flag).

If your module has a `formatResults(results: BenchmarkRunResults): string` function, it will call it, then save the returned string into `./benchmarks/benchmarks.txt` (from the `--formatted` flag).
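For illustration, a minimal `./benchmarks/benchmarks.ts` might look like the sketch below. It assumes the `bench`/`runBenchmarks` API from `std/testing/bench.ts` of that era (the `BenchmarkRunResults` type above is taken to correspond to std's `BenchmarkRunResult`); the std version and the `forLoop1e6` benchmark are made-up examples, not part of this repo.

```ts
// Hedged sketch of a benchmark module; the std version and the example
// benchmark are assumptions, not taken from this repo.
import {
  bench,
  BenchmarkRunResult,
} from "https://deno.land/std@0.57.0/testing/bench.ts";

export function prepareBenchmarks(): void {
  // Register every benchmark the runner should execute.
  bench({
    name: "forLoop1e6", // illustrative benchmark
    runs: 10,           // repeat runs so per-run timings get collected
    func(b): void {
      b.start();
      let sum = 0;
      for (let i = 0; i < 1e6; i++) sum += i;
      b.stop();
    },
  });
}

// Optional: turn the raw results into the text saved via --formatted.
export function formatResults(results: BenchmarkRunResult): string {
  return results.results
    .map((r) => `${r.name}: ${r.totalMs.toFixed(3)}ms total`)
    .join("\n");
}
```

Keeping both functions in one module means the runner needs nothing from you beyond the `--benches` path.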
## Options
- `--benches` : path of the module that exports `prepareBenchmarks` and, optionally, `formatResults` (default: `./benchmarks/benchmarks.ts`)
- `--json` : path where the raw results should be saved (default: `./benchmarks/benchmarks.json`)
- `--formatted` : path where the formatted results should be saved; only used if the module exports `formatResults` (default: `./benchmarks/benchmarks.txt`)
- `-s` : makes `runBenchmarks` run with the `silent` flag
- `-os` : puts `Deno.build` into the results' `os` property
- `-dv` : puts `Deno.version` into the results' `deno_v` property
- `-date` : puts the date into the results' `date` property
- `-metrics` : puts `Deno.metrics()` into the results' `metrics` property
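Put together, the saved JSON roughly takes the shape sketched below. This is an illustrative interface, not one the repo exports: the per-benchmark fields are assumptions (the TODO list does mention a `measuredRunsMs` array), while `os`, `deno_v`, `date` and `metrics` follow the flags above.

```ts
// Illustrative shape of the saved JSON; only the os/deno_v/date/metrics
// properties are documented, the rest is an assumption.
interface SavedResults {
  results: Array<{
    name: string;
    totalMs: number;
    measuredRunsMs?: number[]; // per-run timings (see TODOs below)
  }>;
  os?: typeof Deno.build;       // added by -os
  deno_v?: typeof Deno.version; // added by -dv
  date?: string;                // added by -date
  metrics?: Deno.Metrics;       // added by -metrics
}
```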
## Restrict permissions of the script

It needs `--unstable` (at least for now) and `--allow-net`, because the dynamically imported module has to download its own imports. Adding `--allow-hrtime` makes the measurements more precise.
```bash
deno run --allow-hrtime --allow-read=./benchmarks --allow-write=./benchmarks/benchmarks.json,./benchmarks/benchmarks.txt --unstable --allow-net https://deno.land/x/gh:littletof:deno_bencher/run_benchmarks.ts --json=./benchmarks/benchmarks.json --formatted=./benchmarks/benchmarks.txt -s -os -dv -date -metrics
```

## As a GitHub Action / CI job
This script makes it easy to keep track of your code's performance during development. Run it, then push the results into your repo with another action, like [this one](https://github.com/marketplace/actions/add-commit). There is an example of a GitHub Action that does this in this repo, [here](./.github/workflows/benchmarks.yml).
## TODOs
- [ ] Option to turn off default .json save and use only formatted
- [ ] Format the measuredRunsMs array into a more compact format
- [ ] Make into a proper GitHub Action
- [ ] add script to [deno.land/x/](https://deno.land/x)
- [ ] Tidy up