# HTTP Benchmarks

### 🏎️ List of web frameworks for [Deno](https://deno.land/) raced against each other

Check the benchmarks here:

| Name | Description |
| :-----------------------------------------------------: | :------------------------------------------------------------------------------ |
| [Hello world](./results/hello_world/README.md) | Server responds with simple `"hello world"` text |
| [File server](./results/file_server/README.md) | Static file server, where the server responds with files from a given directory |
| [Multiple routes](./results/multiple_routes/README.md) | Server with multiple routes (see the sketch below the table):<br>• `/hello_world (GET)` – returns `"hello world"`<br>• `/random_number (GET)` – returns a random number from 0 to 1<br>• `/plus_1 (POST)` – adds 1 to the counter<br>• `/minus_1 (POST)` – subtracts 1 from the counter<br>• `/count (GET)` – returns the counter value |
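
For reference, the multiple-routes scenario can be served without any framework at all. The following is a minimal sketch using Deno's built-in `Deno.serve`; the handlers and response bodies here are illustrative and may not match the exact behaviour of the benchmarked implementations.

```ts
// Minimal, framework-free sketch of the "Multiple routes" benchmark target.
// Illustrative only; the real benchmark entries implement the same routes
// with each framework's own router.
let counter = 0;

Deno.serve((req: Request): Response => {
  const { pathname } = new URL(req.url);

  if (req.method === "GET" && pathname === "/hello_world") {
    return new Response("hello world");
  }
  if (req.method === "GET" && pathname === "/random_number") {
    return new Response(String(Math.random()));
  }
  if (req.method === "POST" && pathname === "/plus_1") {
    counter += 1;
    return new Response(String(counter));
  }
  if (req.method === "POST" && pathname === "/minus_1") {
    counter -= 1;
    return new Response(String(counter));
  }
  if (req.method === "GET" && pathname === "/count") {
    return new Response(String(counter));
  }
  return new Response("Not found", { status: 404 });
});
```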

More benchmarks are planned!
However, if you have an idea for a real-world scenario that frameworks' speed should be
tested against, please create an issue about it!

## 🏁 Benchmarking

Benchmarks are done on my PC!
This is because `GitHub Actions` is too slow to unleash the full speed of some of the
frameworks and provides unreliable results.

My CPU is an `AMD Ryzen 7 5700G`, cooled well enough that temperatures don't exceed 55°C under heavy use.
I make sure that nothing is running in the background while benchmarking.

Benchmarks are re-run after:

- Any of the benchmarked modules is updated
- The benchmarking tool's source code changes
- New benchmarks are added
- A new Deno version is released

## 🤝 Contributing

**http_benchmarks** is open to contributions.

If you feel you can enhance this project, please open an issue and/or a pull request.

This project follows [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/) spec.
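
For example, a commit that adds benchmarks for a new framework could be titled `feat: add <framework> benchmarks`, while a fix to the benchmarking tool would use the `fix:` prefix.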

Benchmark code should look like this:
_The description is optional; you can leave it as an empty string._

```ts
export const NAME = "";
export const DESCRIPTION = ""; // optional, may be left empty
export const VERSION = "";

if (import.meta.main) {
  // ...your framework code
}
```
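
As a purely hypothetical illustration (not code taken from this repository), a hello-world entry for the [oak](https://deno.land/x/oak) framework might fill the template in like this; the module URL, version string, and port are assumptions:

```ts
// Hypothetical hello-world benchmark entry for oak.
// The import URL, VERSION value, and port are illustrative assumptions.
import { Application } from "https://deno.land/x/oak@v12.6.1/mod.ts";

export const NAME = "oak";
export const DESCRIPTION = ""; // optional, left empty here
export const VERSION = "12.6.1";

if (import.meta.main) {
  const app = new Application();

  // Every request gets a plain "hello world" response.
  app.use((ctx) => {
    ctx.response.body = "hello world";
  });

  await app.listen({ port: 8000 });
}
```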

## 🙏 To dear web framework developers

If you can, please contribute to this project by creating a pull request that adds benchmark code for your
framework.
Once it's merged, I should be able to keep its version up to date myself.
I'll try to add as many benchmarks as I can on my own, but your help would make it a lot easier,
since you know how to write code for your framework better (and faster!).

## 📝 Licensing

This project is available under the terms of the **MIT** license.