https://github.com/yglukhov/yasync
- Host: GitHub
- URL: https://github.com/yglukhov/yasync
- Owner: yglukhov
- License: mit
- Created: 2022-07-27T15:04:18.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2024-10-06T11:50:15.000Z (about 1 year ago)
- Last Synced: 2024-10-14T15:04:32.462Z (12 months ago)
- Language: Nim
- Size: 61.5 KB
- Stars: 32
- Watchers: 6
- Forks: 1
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
# yasync - Yet another async/await for Nim
[Build status](https://github.com/yglukhov/yasync/actions?query=branch%3Amain) [Nimble package](https://nimble.directory/pkg/yasync)
**Requires Nim devel or 2.0.**
- The semantics are very close to those of Nim's standard `async`, `await` and `Future[T]`.
- The `await` operation doesn't cause any heap allocations (except when calling manually implemented async functions, async closures, and (mutually) recursive calls).
- `callSoon` is not used, and async code is guaranteed to run atomically between `await`s across call boundaries (TODO: maybe add an example).
- The function return type is written as `T`; `async` implicitly transforms it to `Future[T]`.
- Provides an optional compatibility layer for interop with existing `asyncdispatch` (and TODO: chronos) code. [Example](https://github.com/yglukhov/yasync/blob/main/tests/test4.nim). The library itself is completely dispatcher agnostic and depends on neither `asyncdispatch` nor chronos.

This library introduces `async`, `await` and `Future[T]` similar in semantics to Nim's native ones, but implements an optimization to avoid heap allocations. Consider the following sample:
```nim
proc sleep(ms: int): Future[void] = ... # Let's assume this function doesn't allocate

proc foo(a: int): int {.async.} =
  await sleep(a)
  var b = a + 1
  return b

proc bar() {.async.} =
  let a = await foo(5)
  let b = await foo(6)
  echo a + b

waitFor bar()
```

If we pretend that `sleep` doesn't allocate, the whole `waitFor` operation above will not do a single allocation. The optimization happens in `async` functions on `await` calls.
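As a rough intuition only (a hypothetical sketch of how such an optimization can be laid out, not a description of yasync's actual internals): an `async` macro can generate a state object per async proc and embed the state of each awaited call inside the caller's state, so the whole call tree under `waitFor` fits in a single value. All type and field names below are purely illustrative.

```nim
type
  SleepState = object       # hypothetical state for `sleep(ms)`
    ms: int
    finished: bool

  FooState = object         # hypothetical state for `foo(a)`
    a, b: int
    sleepState: SleepState  # callee state embedded, so `await sleep(a)` needs no heap allocation

  BarState = object         # hypothetical state for `bar()`
    a, b: int
    fooState: FooState      # `await foo(...)` likewise reuses space inside the caller's state

var root: BarState          # `waitFor bar()` could keep a single value like this on its own stack
echo sizeof(root)           # the entire async call tree occupies one flat object
```

A layout like this would also be consistent with the exceptions listed above: async closures, manually implemented async functions and (mutually) recursive calls don't have a statically known, finite nesting depth, so they would still need heap allocations.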
TODO: Describe how it works in detail.
TODO:
- [x] Recursive calls
- [x] Generics
- [ ] Cancellation
- [ ] Nice stack traces