# sema4



> A semaphore implementation using promises. Forked from [vercel/async-sema](https://github.com/vercel/async-sema/).

## Table of Contents

- [Usage](#usage)
- [API](#api)
- [Examples](#examples)
- [Contributors](#contributors)
- [Changelog](./CHANGELOG.md)

## Features

- **Universal** - Works in all modern browsers, [Node.js](https://nodejs.org/), and [Deno](https://deno.land/).
- **Zero Dependencies** - Absolutely no dependencies, keeping the package tiny (24kb).
- **Tested** - Greater than 98% test coverage.
- **Typed** - Out of the box TypeScript declarations.

## Usage

### Browsers

Load sema4 directly from [esm.sh](https://esm.sh):

```html
<script type="module">
  import { Sema } from 'https://esm.sh/sema4';
</script>
```

### Deno

Load sema4 directly from [esm.sh](https://esm.sh):

```ts
import { Sema } from 'https://esm.sh/sema4?dts';
```

### Node 18+

Install with `npm install sema4` or `yarn add sema4`:

```js
import { Sema } from 'sema4';
```

## API

### Sema

#### Constructor(maxConcurrency, { initFn, pauseFn, resumeFn, capacity })

| Name | Type | Optional | Default | Description |
| ------------------ | -------- | -------- | ----------- | ----------- |
| `maxConcurrency` | Integer | No | | The maximum number of callers allowed to acquire the semaphore concurrently |
| `options.initFn` | Function | Yes | `() => '1'` | The function used to initialize the tokens that manage the semaphore |
| `options.pauseFn` | Function | Yes\* | | A function called to opportunistically request pausing the incoming stream of data, instead of piling up waiting promises and possibly running out of memory. Must be declared together with `resumeFn` |
| `options.resumeFn` | Function | Yes\* | | A function called when there is room again to accept new waiters on the semaphore. Must be declared together with `pauseFn` |
| `options.capacity` | Integer | Yes | 10 | The size of the pre-allocated waiting list inside the semaphore. This is typically used by high-performance applications where the developer can make a rough estimate of the number of concurrent users of the semaphore |

#### `async sema.drain()`

Drains the semaphore and returns all the initialized tokens in an array. Draining is an ideal way to ensure there are no pending async tasks, for example before a process terminates.

#### `sema.waiting()`

Returns the number of callers waiting on the semaphore, i.e. the number of pending promises.

#### `sema.tryAcquire()`

Attempts to acquire a token from the semaphore, if one is available immediately. Otherwise, returns `undefined`.

#### `async sema.acquire()`

Acquires a token from the semaphore, decrementing the number of available execution slots. If `initFn` is not used, the return value of the function can be discarded.

#### `sema.release(token)`

Releases the semaphore, incrementing the number of free execution slots. If `initFn` is used, the `token` returned by `acquire()` should be passed as an argument when calling this function.

### RateLimit

#### Constructor(rate, { interval, uniformDistribution })

Creates a rate limit instance.

| Name | Type | Optional | Default | Description |
| ----------------------------- | ------- | -------- | ------- | ----------- |
| `rate` | Integer | No | | Number of tasks allowed per `interval` |
| `options.interval` | Integer | Yes | 1000 | The width of the rate-limiting window in milliseconds |
| `options.uniformDistribution` | Boolean | Yes | False | Enforces a discrete uniform distribution of calls over time. This is mainly useful when the flow of rate-limited calls is continuous and faster than `interval` (e.g. reading a file): without it, the first `rate` calls resolve immediately, exhausting the limit, and the next batch of calls must wait a full `interval`. If the flow is sparse, this option may make the code run slower with no advantage |

#### `async rateLimit.apply()`

Acquires a semaphore slot and schedules a timeout for its release. If the rate limit has been reached, execution waits until a slot is released.

#### `rateLimit.reset()`

Releases all acquired semaphores immediately and resets the timeouts connected to them.

## Examples

```js
import { Sema } from 'sema4';

function foo(urls) {
  const s = new Sema(
    4, // Allow 4 concurrent async calls
    {
      capacity: 100, // Pre-allocated space for 100 tokens
    },
  );

  async function fetchData(url) {
    await s.acquire();

    try {
      console.log(s.waiting() + ' calls to fetch are waiting');
      // Perform some async task with `url` here...
    } finally {
      s.release();
    }
  }

  return Promise.all(urls.map(fetchData));
}
```

```js
import { RateLimit } from 'sema4';

async function bar(n) {
  const rl = new RateLimit(5); // Limit to 5 tasks per default time interval (1000ms)

  for (let i = 0; i < n; i++) {
    await rl.apply();
    // Perform some async task here...
  }
}
```

## Contributors

In addition to the contributors of the parent repository [vercel/async-sema](https://github.com/vercel/async-sema), these lovely people have helped keep this library going.



Justin Dalrymple