Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/djwassink/promise-parallel-throttle
It's kinda like Promise.all(), but throttled!
- Host: GitHub
- URL: https://github.com/djwassink/promise-parallel-throttle
- Owner: DJWassink
- License: mit
- Created: 2016-10-20T20:16:32.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2024-09-23T22:22:45.000Z (about 2 months ago)
- Last Synced: 2024-10-23T03:35:23.640Z (23 days ago)
- Topics: parallel, promise, sequential, throttle
- Language: TypeScript
- Homepage:
- Size: 256 KB
- Stars: 81
- Watchers: 3
- Forks: 3
- Open Issues: 1
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# Promise-parallel-throttle
[![Build Status](https://github.com/DJWassink/Promise-parallel-throttle/actions/workflows/main.yml/badge.svg)](https://github.com/DJWassink/Promise-parallel-throttle/actions/workflows/main.yml)
[![npm version](https://badge.fury.io/js/promise-parallel-throttle.svg)](https://badge.fury.io/js/promise-parallel-throttle)
[![npm downloads](https://img.shields.io/npm/dm/promise-parallel-throttle.svg)](https://www.npmjs.com/package/promise-parallel-throttle)
[![Bundlephobia](https://badgen.net/bundlephobia/min/promise-parallel-throttle)](https://bundlephobia.com/result?p=promise-parallel-throttle)
[![Bundlephobia](https://badgen.net/bundlephobia/minzip/promise-parallel-throttle)](https://bundlephobia.com/result?p=promise-parallel-throttle)

Run an array of Promises in parallel. Kinda like Promise.all(), but throttled!
## Install
### NPM
```bash
npm i promise-parallel-throttle -S
```

### Yarn
```bash
yarn add promise-parallel-throttle
```

## Usage
```js
import * as Throttle from 'promise-parallel-throttle';

//Function which should return a Promise
const doReq = async (firstName, lastName) => {
    //Do something async.
    return firstName + ' ' + lastName;
};

const users = [
    {firstName: 'Irene', lastName: 'Pullman'},
    {firstName: 'Sean', lastName: 'Parr'},
];

//Queue with functions to be run
const queue = users.map((user) => () => doReq(user.firstName, user.lastName));

//Default Throttle runs with 5 promises parallel.
const formattedNames = await Throttle.all(queue);

console.log(formattedNames); //['Irene Pullman', 'Sean Parr']
```

[![Edit Promise-parallel-throttle example](https://codesandbox.io/static/img/play-codesandbox.svg)](https://codesandbox.io/s/4x1943m2v7)
## API
### Throttle.all
`Throttle.all(tasks, options)`
Throttle.all is made to behave exactly like Promise.all, but instead of running all the tasks in parallel it runs at most a maximum number of tasks at the same time.
Only the tasks parameter is required, while the [options](#options-object) parameter is optional.
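For example, a minimal sketch of throttling a larger queue with the `maxInProgress` option documented in the [options](#options-object) table below (the task list and the limit of 2 are illustrative):

```js
import * as Throttle from 'promise-parallel-throttle';

//Illustrative queue of 10 task functions, each returning a Promise
const tasks = Array.from({length: 10}, (_, i) => () => Promise.resolve(i * 2));

//At most 2 tasks are in flight at any time; like Promise.all, the results
//come back in the same order as the tasks array
const results = await Throttle.all(tasks, {maxInProgress: 2});
console.log(results); //[0, 2, 4, ..., 18]
```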
### Throttle.sync

`Throttle.sync(tasks, options)`
Throttle.sync runs all the tasks sequentially, one at a time.
Once again the tasks array is required; the [options](#options-object) are optional.
Be aware that this method is simply a wrapper that passes `maxInProgress: 1`, so overriding this option in the options object would run the tasks in parallel again.
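For instance, a small sketch (the delays are illustrative) showing that the second task only starts after the first one has settled:

```js
import * as Throttle from 'promise-parallel-throttle';

const delay = (ms, value) => new Promise((resolve) => setTimeout(() => resolve(value), ms));

//Even though the first task is slower, it finishes before the second one starts
const tasks = [() => delay(100, 'first'), () => delay(10, 'second')];

const results = await Throttle.sync(tasks);
console.log(results); //['first', 'second']
```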
### Throttle.raw

`Throttle.raw(tasks, options)`
Instead of returning the tasks' results, the raw method returns a [result](#result-object--progress-callback) object.
Useful if you want more statistics about the execution of your tasks. Once again the tasks are required while the [options](#options-object) are optional.
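For example, a sketch that inspects a few properties of the [Result object](#result-object--progress-callback) documented below; it assumes raw's default `failFast: false` lets the remaining tasks finish and resolves with that object (the failing task is illustrative):

```js
import * as Throttle from 'promise-parallel-throttle';

//One of these illustrative tasks rejects; with failFast: false the others still run
const tasks = [
    () => Promise.resolve('ok'),
    () => Promise.reject(new Error('boom')),
    () => Promise.resolve('also ok'),
];

const result = await Throttle.raw(tasks);
console.log(result.amountResolved);  //2
console.log(result.amountRejected);  //1
console.log(result.rejectedIndexes); //[1]
console.log(result.taskResults);     //one entry per task index
```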
#### Options object

| Parameter | Type | Default | Definition |
| :-------------------- | :------- | :---------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------- |
| maxInProgress | Integer | 5 | maximum number of tasks running in parallel |
| failFast | Boolean | true (false for the [raw](#throttleraw) method) | reject after a single error, or keep running |
| progressCallback | Function | Optional | callback with progress reports |
| nextCheck | Function | Optional | function which should return a promise; if the promise resolves to true the next task is spawned; errors will propagate and should be handled in the calling code |
| ignoreIsFunctionCheck | Boolean | false | If one of the tasks is not a function an error is thrown; if this boolean is set to true we simply return the task itself |
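As an example, a sketch that reports progress through the `progressCallback` option; the callback receives a status object shaped like the [Result object](#result-object--progress-callback) below (the queue and the exact timing of the reports are illustrative):

```js
import * as Throttle from 'promise-parallel-throttle';

//Illustrative queue of 20 task functions
const tasks = Array.from({length: 20}, (_, i) => () => Promise.resolve(i));

const results = await Throttle.all(tasks, {
    maxInProgress: 3,
    //Log how far along the run is every time a progress report comes in
    progressCallback: (status) => {
        console.log(`done ${status.amountDone}/${tasks.length}`);
    },
});
```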
#### Result object / Progress callback

The `progressCallback` receives, and the [raw](#throttleraw) method returns, a `Result` object with the following properties:
| Property | Type | Start value | Definition |
| :--------------------- | :------ | :---------- | :------------------------------------------------------------------------------------------- |
| lastCompletedIndex | Integer | -1 | last index of a task that is completed (either fulfilled or rejected) |
| amountDone | Integer | 0 | amount of tasks which are finished |
| amountStarted | Integer | 0 | amount of tasks which started |
| amountResolved | Integer | 0 | amount of tasks which successfully resolved |
| amountRejected | Integer | 0 | amount of tasks which ended in an error and were aborted |
| amountNextCheckFalsey | Integer | 0 | amount of tasks which got a falsey value in the [nextCheck](#nextcheck) |
| rejectedIndexes | Array | [] | all the indexes in the tasks array where the promise rejected |
| resolvedIndexes | Array | [] | all the indexes in the tasks array where the promise resolved |
| nextCheckFalseyIndexes | Array | [] | all the indexes in the tasks array where the [nextCheck](#nextcheck) returned a falsey value |
| taskResults | Array | [] | array containing the result of every task |
#### nextCheck

All the `Throttle` methods accept a `nextCheck` option which is used to verify whether the next task is allowed to start.
The default `nextCheck` is defined like this:
```js
const defaultNextTaskCheck = (status, tasks) => {
    return new Promise((resolve, reject) => {
        resolve(status.amountStarted < tasks.length);
    });
};
```

This function receives a status object as a parameter, which adheres to the [Result object](#result-object--progress-callback), and it also receives the list of tasks.
In the default `nextCheck` we simply check whether the amount of started tasks is still below the total amount of tasks; if so, we are free to start another task.

This function can be useful to write your own scheduler based on, for example, RAM/CPU usage.
Let's say that your tasks use a lot of RAM and you don't want to exceed a certain amount.
You could then write logic inside a `nextCheck` function which resolves once there is enough RAM available to start the next task.

If a custom implementation decides to reject, the error is propagated and should be handled in the user's code. If a custom implementation resolves to a falsey value, that task will simply not execute and the next task will be scheduled.
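As a rough illustration, here is a minimal sketch of such a memory-aware `nextCheck`, assuming Node's `process.memoryUsage()`; the heap budget, the polling interval, and the `queue` variable are illustrative, not part of the library:

```js
import * as Throttle from 'promise-parallel-throttle';

//Illustrative heap budget; tune this to your own workload
const MAX_HEAP_BYTES = 512 * 1024 * 1024;

const memoryAwareNextCheck = (status, tasks) => {
    return new Promise((resolve) => {
        //Nothing left to start
        if (status.amountStarted >= tasks.length) {
            return resolve(false);
        }
        //Poll until heap usage drops below the budget, then allow the next task
        const check = () => {
            if (process.memoryUsage().heapUsed < MAX_HEAP_BYTES) {
                resolve(true);
            } else {
                setTimeout(check, 100);
            }
        };
        check();
    });
};

//queue is a list of functions returning Promises, as in the Usage section above
const results = await Throttle.all(queue, {nextCheck: memoryAwareNextCheck});
```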
## Example
Check out the examples directory; it's heavily documented so it should be easy to follow.
To run the example, Node 8.x.x or later is required, since it supports native async/await.
Simply run the example with npm:
```bash
npm run-script names
```

Or with Yarn:
```bash
yarn names
```