https://github.com/ovhemert/elasticsearch-batch-stream
A write stream that creates batches of elasticsearch bulk operations
- Host: GitHub
- URL: https://github.com/ovhemert/elasticsearch-batch-stream
- Owner: ovhemert
- License: MIT
- Created: 2019-03-22T08:35:59.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2023-03-02T21:14:04.000Z (over 2 years ago)
- Last Synced: 2025-07-09T17:13:20.444Z (3 months ago)
- Topics: batch, bulk, chunk, elastic, elasticsearch, stream
- Language: JavaScript
- Size: 390 KB
- Stars: 5
- Watchers: 2
- Forks: 1
- Open Issues: 3
Metadata Files:
- Readme: README.md
- Contributing: docs/CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE
- Code of conduct: docs/CODE_OF_CONDUCT.md
# elasticsearch-batch-stream
[Build Status](https://travis-ci.com/ovhemert/elasticsearch-batch-stream)
[Codacy Badge](https://www.codacy.com/app/ovhemert/elasticsearch-batch-stream?utm_source=github.com&utm_medium=referral&utm_content=ovhemert/elasticsearch-batch-stream&utm_campaign=Badge_Grade)
[Known Vulnerabilities](https://snyk.io/test/npm/elasticsearch-batch-stream)
[Coverage Status](https://coveralls.io/github/ovhemert/elasticsearch-batch-stream?branch=master)
[JavaScript Style Guide](http://standardjs.com/)

A write stream that creates batches of elasticsearch bulk operations.
## Example
The Elasticsearch library has a [bulk](https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/api-reference.html#api-bulk) function for writing multiple documents in a single request, but since a stream emits a write for each document, operations are not grouped together automatically.
This package wraps the `bulk` function in a write stream that buffers incoming operations and passes them on to `bulk` in batches. For example, with batches of 500 docs each, indexing 100,000 documents takes 200 API calls to Elasticsearch instead of 100,000, which improves speed.
```js
const through2 = require('through2')
const bulkWriteStream = require('elasticsearch-batch-stream') // export name as documented in the API section below

const docTransformStream = through2.obj(function (chunk, enc, callback) {
  // convert chunk => doc
  const doc = { index: 'myindex', type: 'mytype', id: '12345', action: 'index', doc: { name: 'test' } }
  callback(null, doc)
})

// sourceReadStream() and client are assumed to be defined elsewhere
sourceReadStream().pipe(docTransformStream).pipe(bulkWriteStream({ client, size: 500 }))
```

## Installation
```bash
$ npm install elasticsearch-batch-stream
```

## API
`bulkWriteStream(options = { client, size })`

Creates the write stream to Elasticsearch.
### options
The options object argument is required and must at least include the Elasticsearch client.
#### client
An instance of the Elasticsearch client, e.g. `new elasticsearch.Client()`.
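A minimal sketch of constructing that client with the legacy `elasticsearch` npm package; the host value is an illustrative assumption:

```js
const elasticsearch = require('elasticsearch')

// point this at your own cluster; 'localhost:9200' is just an example
const client = new elasticsearch.Client({ host: 'localhost:9200' })
```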
#### size
Number of stream operations to group together in the bulk command (default = 100).
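Putting the options together, here is a small end-to-end sketch; the in-memory source and the `client` variable are assumptions for illustration, and the export name follows the API signature above:

```js
const { Readable } = require('stream')
const bulkWriteStream = require('elasticsearch-batch-stream')

// a tiny in-memory source standing in for a real object-mode read stream
const docs = [
  { index: 'myindex', type: 'mytype', id: '1', action: 'index', doc: { name: 'one' } },
  { index: 'myindex', type: 'mytype', id: '2', action: 'index', doc: { name: 'two' } }
]

Readable.from(docs)
  .pipe(bulkWriteStream({ client, size: 500 })) // client: an Elasticsearch client instance
  .on('finish', () => console.log('all batches flushed'))
```

With `size: 500`, these two documents would be flushed in a single bulk call; larger inputs are split into batches of 500 operations each.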
## Maintainers
Osmond van Hemert
[GitHub](https://github.com/ovhemert)
[Website](https://ovhemert.dev)

## Contributing
If you would like to help out with some code, check the [details](./docs/CONTRIBUTING.md).
Not a coder, but still want to support? Have a look at the options available to [donate](https://ovhemert.dev/donate).
## License
Licensed under [MIT](./LICENSE).