https://github.com/sindresorhus/chunkify
Split an iterable into evenly sized chunks
- Host: GitHub
- URL: https://github.com/sindresorhus/chunkify
- Owner: sindresorhus
- License: MIT
- Created: 2021-01-27T11:02:02.000Z (almost 5 years ago)
- Default Branch: main
- Last Pushed: 2023-12-27T16:31:24.000Z (almost 2 years ago)
- Last Synced: 2025-03-29T05:25:11.306Z (9 months ago)
- Topics: array, array-chunk, array-manipulations, javascript, npm-package
- Language: JavaScript
- Homepage:
- Size: 9.77 KB
- Stars: 154
- Watchers: 2
- Forks: 7
- Open Issues: 0
Metadata Files:
- Readme: readme.md
- License: license
README
# chunkify
> Split an iterable into evenly sized chunks
## Install
```sh
npm install chunkify
```
## Usage
```js
import chunkify from 'chunkify';

console.log([...chunkify([1, 2, 3, 4], 2)]);
//=> [[1, 2], [3, 4]]

console.log([...chunkify([1, 2, 3, 4], 3)]);
//=> [[1, 2, 3], [4]]
```
## API
### chunkify(iterable, chunkSize)
Returns an iterable of the chunks. The last chunk may be smaller than `chunkSize`.
#### iterable
Type: `Iterable` *(for example, `Array`)*
The iterable to chunkify.
#### chunkSize
Type: `number` *(integer)*\
Minimum: `1`
The size of the chunks.
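Because the first parameter is any `Iterable`, non-array inputs such as a `Set` work too; a quick sketch, with output shapes inferred from the usage above:
```js
import chunkify from 'chunkify';

const letters = new Set(['a', 'b', 'c', 'd', 'e']);

console.log([...chunkify(letters, 2)]);
//=> [['a', 'b'], ['c', 'd'], ['e']] (last chunk is smaller)
```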
## Use-cases
### Batch processing
When dealing with large datasets, splitting the data into manageable chunks lets you process it batch by batch instead of all at once.
```js
import chunkify from 'chunkify';

const largeDataSet = [...Array(1000).keys()];
const chunkedData = chunkify(largeDataSet, 50);

for (const chunk of chunkedData) {
	processBatch(chunk); // `processBatch` stands in for your own batch handler.
}
```
### Parallel processing
Dividing data into chunks lets you distribute the workload evenly across threads or workers.
```js
import {Worker} from 'node:worker_threads';
import chunkify from 'chunkify';

const data = [/* some large dataset */];

// chunkify returns an iterable, not an array, so spread it before calling .entries().
for (const [index, chunk] of [...chunkify(data, 20)].entries()) {
	const worker = new Worker('./worker.js', {
		workerData: {
			chunk,
			index
		}
	});
}
```
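For completeness, a minimal hypothetical `worker.js` for the sketch above might look like this; the file name and the `{chunk, index}` payload shape come from the example, not from chunkify itself:
```js
import {parentPort, workerData} from 'node:worker_threads';

// workerData carries the {chunk, index} payload passed by the main thread.
const {chunk, index} = workerData;

// Placeholder transform; replace with the real per-item work.
const result = chunk.map(item => item);

// Report the processed chunk back to the main thread.
parentPort.postMessage({index, result});
```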
### Network requests
Splitting a large number of network requests into chunks helps manage network load and avoid rate limiting.
```js
import chunkify from 'chunkify';

const urls = [/* Array of URLs */];
const chunkedUrls = chunkify(urls, 10);

for (const chunk of chunkedUrls) {
	// Fire at most 10 requests at a time; wait for each batch to settle.
	await Promise.all(chunk.map(url => fetch(url)));
}
```
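Because each chunk is awaited before the next begins, at most `chunkSize` requests are in flight at once. If you also need to space the batches out, a minimal sketch using Node's promise-based `setTimeout` (the one-second delay is an arbitrary placeholder):
```js
import {setTimeout as delay} from 'node:timers/promises';
import chunkify from 'chunkify';

const urls = [/* Array of URLs */];

for (const chunk of chunkify(urls, 10)) {
	await Promise.all(chunk.map(url => fetch(url)));
	await delay(1000); // Pause between chunks to stay under rate limits.
}
```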