https://github.com/stringparser/batch-these
batch with ease
- Host: GitHub
- URL: https://github.com/stringparser/batch-these
- Owner: stringparser
- License: other
- Created: 2014-09-29T18:48:29.000Z (over 11 years ago)
- Default Branch: master
- Last Pushed: 2014-10-09T15:25:06.000Z (over 11 years ago)
- Last Synced: 2025-02-17T15:49:54.642Z (11 months ago)
- Language: JavaScript
- Homepage:
- Size: 258 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: readme.md
- License: LICENSE
README
# batch-these

[progress](https://github.com/fehmicansaglam/progressed.io) [build status](https://travis-ci.org/stringparser/batch-these/builds) [npm](http://www.npmjs.org/package/batch-these)
batch data with ease
## install
npm install --save batch-these
## example
```js
var batch = require('batch-these');
batch.wait(10); // 10 milliseconds tops
process.on('stuff-started', function(e){
var chunk = e.name;
batch.these(chunk, function(data){
console.log('Started ', data.join(', ') );
});
});
process.on('stuff-done', function(e){
var chunk = e.name + ' in ' + Math.floor(e.time) + ' ms';
batch.these(chunk, function(data){
console.log('Done with Mr.', data.join(', Mr. ') );
});
});
var dogs = ['Blue', 'Pink', 'Eddie', 'Joe', 'White', 'Brown', 'Blonde', 'Orange'];
dogs.forEach(function(name, index){
var time = process.hrtime();
setTimeout(function(){
process.emit('stuff-started', {
name : name,
time : time
});
var rand = Math.floor(Math.random()*100);
setTimeout(function(){
process.emit('stuff-done', {
name : name,
time : process.hrtime(time)[1]/1000000
});
}, rand);
}, (index + 1)*11);
});
```
which will output something similar to
```
Started Blue, Pink, Eddie, Joe, White
Done with Mr. Pink in 31 ms
Started Brown
Done with Mr. Joe in 20 ms, Mr. Brown in 3 ms, Mr. Eddie in 37 ms, Mr. Blue in 59 ms
Started Blonde
Done with Mr. White in 20 ms
Started Orange
Done with Mr. Blonde in 15 ms
Done with Mr. Orange in 44 ms
```
### documentation
`var batch = require('batch-these')`
#### batch.these(chunk, callback)
`chunk`
type: any | default: none
Data to be accumulated.
`callback`
type: `function` | default: none
Function that receives the accumulated data when the batch is flushed.
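For instance, a minimal sketch of the behaviour this implies (the exact flush timing depends on `batch.wait`, set here for illustration; the output shape assumes the default storer):

```js
var batch = require('batch-these');
batch.wait(10);

// every call below comes from the same file:line:column,
// so the chunks end up accumulated into one batch
['a', 'b', 'c'].forEach(function(id){
  batch.these(id, function(data){
    console.log('flushed', data.join(', ')); // e.g. "flushed a, b, c"
  });
});
```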
#### batch.store([callback])
Sets how your chunks are stored. This is the default:
```js
function batchStore(batch, chunk){
  batch.data = batch.data || [ ];
  batch.data.push(chunk);
}
```
`batch.store()` returns the current `storer`.
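Assuming a replacement storer takes the same `(batch, chunk)` arguments as the default above, swapping it out could look like this sketch, for example to drop duplicate chunks:

```js
// sketch: a storer that also drops duplicate chunks
batch.store(function(batch, chunk){
  batch.data = batch.data || [];
  if (batch.data.indexOf(chunk) === -1){
    batch.data.push(chunk);
  }
});

var storer = batch.store(); // no arguments: returns the storer in use
```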
#### batch.filter([callback])
Decides which batch a chunk is accumulated into. The default is:
```js
function batchFilter(batch, caller){
  return batch.location === caller.location;
}
```
where `location` has the stack-frame format
`filename:lineNumber:columnNumber`
`batch.filter()` returns the current filter.
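Assuming a custom filter receives the same `(batch, caller)` arguments as the default, here is a sketch of a filter that groups by file only, ignoring line and column numbers:

```js
// sketch: treat every call from the same file as one batch,
// ignoring line and column numbers of the call site
batch.filter(function(batch, caller){
  return String(batch.location).split(':')[0] ===
         String(caller.location).split(':')[0];
});

var filter = batch.filter(); // no arguments: returns the filter in use
```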
#### batch.wait([ms])
`ms`
type: `number` | default: `0` milliseconds
Time in `ms` to wait between batches.
#### batch.origin([handle])
`handle`
type: `function` | default: `console.log`
Function whose calls are tracked to locate the batches.
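A short sketch of tuning both settings; whether handles other than `console.log` make sensible origins is an assumption here:

```js
batch.wait(50);              // wait 50 ms in between batches
batch.origin(console.error); // assumption: take the stacktrace from
                             // console.error calls instead of console.log
```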
### how it works
Internally it uses [callers-module](https://github.com/stringparser/callers-module) to get just one *stacktrace* frame. With that frame one can figure out *the exact location* of the `callback`. Based on that, a batch is stored. For each *location* a batch is kept waiting for a new chunk using a timer. That's it.
The time to wait is set with `batch.wait([ms])`.
The origin from which the *stacktrace* is taken is set with `batch.origin([handle])`.
NOTE: the package is devised to work hand in hand with `process.stdout.write`. That is, the package [*monkeypatches*](https://github.com/stringparser/stdout-monkey) `stdout` in order to feed from its data.
As it stands it would need some changes, but it should work with any other function call, provided that function is patched first.
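Not the package's actual code, but a stripped-down sketch of one way to read the idea: one bucket and one timer per call-site location, flushed when the timer fires.

```js
// illustration only: batch chunks per call-site "location" with a timer
var batches = {};

function batchAt(location, chunk, wait, flush){
  var b = batches[location] = batches[location] || { data: [], timer: null };
  b.data.push(chunk);
  clearTimeout(b.timer);            // a new chunk arrived: restart the wait
  b.timer = setTimeout(function(){  // nothing new within `wait` ms: flush
    flush(b.data);
    delete batches[location];
  }, wait);
}
```

In the real package the *location* comes from the single stacktrace frame mentioned above.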
## why
You would like to keep `stdout` writes to the bare minimum.
## test
npm test
### license
[MIT](http://opensource.org/licenses/MIT)