{"id":19370028,"url":"https://github.com/bustle/bluestream","last_synced_at":"2025-05-08T17:22:08.040Z","repository":{"id":23608921,"uuid":"98933738","full_name":"bustle/bluestream","owner":"bustle","description":"A collection of streams that work well with promises (through, map, reduce). Think Through2 with promises","archived":false,"fork":false,"pushed_at":"2022-05-01T00:55:57.000Z","size":317,"stargazers_count":102,"open_issues_count":4,"forks_count":5,"subscribers_count":25,"default_branch":"master","last_synced_at":"2025-04-13T03:03:04.944Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/bustle.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-07-31T22:03:03.000Z","updated_at":"2025-02-28T03:02:19.000Z","dependencies_parsed_at":"2022-07-27T04:02:29.359Z","dependency_job_id":null,"html_url":"https://github.com/bustle/bluestream","commit_stats":null,"previous_names":[],"tags_count":36,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bustle%2Fbluestream","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bustle%2Fbluestream/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bustle%2Fbluestream/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bustle%2Fbluestream/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/bustle","download_url":"https://codeload.github.com/bustle/bluestream/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://githu
b.com","kind":"github","repositories_count":253112365,"owners_count":21856124,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-10T08:14:02.175Z","updated_at":"2025-05-08T17:22:08.013Z","avatar_url":"https://github.com/bustle.png","language":"TypeScript","readme":"# bluestream 🏄‍♀️\n\n[![Build Status](https://travis-ci.org/bustle/bluestream.svg?branch=master)](https://travis-ci.org/bustle/bluestream) [![Try bluestream on RunKit](https://badge.runkitcdn.com/bluestream.svg)](https://npm.runkit.com/bluestream)\n\n\nBluestream is a collection of NodeJS Streams and stream utilities that work well with promises and async functions. Think `through2-concurrent` with promise support. The goal is to reduce the edge cases when mixing streams and promises. 
In general, Promises are slower than callbacks, but these streams are a lot more forgiving than node core.\n\nIf you don't need streams but want to work with data over time, check out sister project [`streaming-iterables 🏄‍♂️`](https://www.npmjs.com/package/streaming-iterables)!\n\nWritten in TypeScript, designed in NYC.\n\n## Install\nThere are no dependencies.\n\n```bash\nnpm install bluestream\n```\n\n# Examples\n\n```js\nimport { read, transform, write, pipe } from 'bluestream'\nimport got from 'got'\n\n// paginate an API\nconst pokeStream = read(async function () {\n  this.offset = this.offset || 0\n  const { body: { results } } = await got(`https://pokeapi.co/api/v2/pokemon/?offset=${this.offset}`, { json: true })\n  if (results.length \u003e 0) {\n    this.offset += results.length\n    for (const monster of results) {\n      this.push(monster)\n    }\n  } else {\n    return null\n  }\n})\n\nconst fetchMonsterInfo = transform({ concurrent: 2 }, async ({ url }) =\u003e {\n  const { body } = await got(url, { json: true })\n  return body\n})\n\nconst logStream = write(pokemon =\u003e {\n  console.log(`\u003ch1\u003e${pokemon.name}\u003c/h1\u003e\u003cimg src=\"${pokemon.sprites.front_default}\"\u003e`)\n})\n\nawait pipe(\n  pokeStream,\n  fetchMonsterInfo,\n  logStream\n)\nconsole.log('caught them all')\n```\n\n# api\n\n- [`read()`](#read)\n- [`ReadStream()`](#readstream)\n- [`transform()`](#transform-alias-map) (alias `map`)\n- [`TransformStream()`](#transformstream)\n- [`write()`](#write)\n- [`WriteStream()`](#writestream)\n- [`filter()`](#filter)\n- [`reduce()`](#reduce)\n- [`tap()`](#tap)\n- [`batch()`](#batch)\n- [`wait()`](#wait)\n- [`pipe()`](#pipe)\n- [`collect()`](#collect)\n- [`readAsync()`](#readasync)\n- [`iterate()`](#iterate)\n- [`promise()`](#promise)\n\n## read\n\n`([opts:Options,] fn:(bytesWanted) =\u003e Promise) =\u003e ReadStream`\n\n## ReadStream\n\nCreates a read-promise stream which accepts a function that takes the number of bytes or 
objects of wanted data as arguments and uses `this.push` or `return` to push values or promises. This function should return a promise that indicates when the object/chunk is fully processed. Return or push `null` to end the stream.\n\nOptions:\n  * `read` - An optional way to pass the read function\n\n  * `objectMode` - true or false\n\n  * all other `Readable` stream options\n\nThe other options are also passed to node's Read stream constructor.\n\nA `ReadStream` works like a normal `ReadableStream` but the `_read` and `push()` methods have some noteworthy differences. (The `_read` method can be provided as the only argument, in a `read` key on the options, or as the `_read` method if you extend `ReadStream`.) Any returned, non-undefined value will automatically be pushed. Object mode is the default.\n\n`_read(bytesWanted)`\n- Is async function friendly, handles throws/rejects as error events\n- Is called again only after it returns or resolves regardless of how many times you call `.push`\n- Is called again if you don't push (to aid in control flow)\n- Pushes any non-`undefined` return values\n\n`this.push()`\n- Can be called in a promise, which will be resolved and then pushed as normal\n- Returns true or false like a normal stream's push\n\nThis allows you to use it in some friendly ways:\n\n```js\n// readable stream from an array\nconst list = [1, 2, 3]\nconst listStream = bstream.read(() =\u003e list.shift() || null)\n\n// readable stream from redis scans\nimport Redis from 'ioredis'\nconst redis = new Redis()\nlet cursor = '0'\n\n// a regular function (not an arrow) so `this` is the stream\nconst scanStream = bstream.read(async function () {\n  const [newCursor, keys] = await redis.scan(cursor)\n  for (const key of keys) {\n    this.push(key)\n  }\n  if (newCursor === '0') {\n    this.push(null)\n  }\n  cursor = newCursor\n})\n```\n\n## transform (alias map)\n\n`transform([opts:Options,] fn:(data[, enc]) =\u003e Promise): TransformStream`\n\n`map([opts:Options,] fn:(data[, enc]) =\u003e Promise): TransformStream`\n\n## 
TransformStream\n\nCreates a transform-promise stream which accepts a function that takes data and\nencoding as arguments and uses `this.push` to push values or promises. Any returned, non-undefined value will automatically be pushed. This function should return a promise that indicates when the object/chunk is fully processed.\n\nOptions:\n  * `transform` - An optional way to pass the transform function\n\n  * `concurrent` - The maximum number of concurrent promises that are allowed.\n    When this limit is reached, the stream will stop processing data and will\n    start buffering incoming objects. Defaults to `1`\n\n  * `highWaterMark` - the size (in objects) of the buffer mentioned above. When\n    this buffer fills up, the backpressure mechanism will activate. It's passed\n    to node's transform stream.\n\nThe other options are also passed to node's Transform stream constructor.\n\n## write\n\n`write([opts:Options,] fn:(data[, enc]) =\u003e Promise): WriteStream`\n\n## WriteStream\n\n`new WriteStream(inputOpts: IWritableStreamOptions | writeFunction, fn?: writeFunction): WriteStream`\n\nCreates a write-promise stream which accepts a function that takes data and encoding as arguments and returns a promise that indicates when the object/chunk is fully processed.\n\nOptions:\n  * `write` - An optional way to pass the write function\n\n  * `writev` - Not supported; passed directly to the underlying `Writable` stream\n\n  * `concurrent` - The maximum number of concurrent promises that are allowed.\n    When this limit is reached, the stream will stop processing data and will\n    start buffering incoming objects. Defaults to `1`\n\n  * `highWaterMark` - the size (in objects) of the buffer mentioned above. When\n    this buffer fills up, the backpressure mechanism will activate. 
It's passed\n    to node's write stream.\n\nThe other options are also passed to node's Write stream constructor.\n\n## filter\n\n`filter([opts:Options,] fn: async (data[, enc]) =\u003e boolean): FilterStream`\n\nCreates a new FilterStream which accepts a function that takes data and encoding as arguments and returns a boolean to\nindicate whether the data value should pass to the next stream.\n\nOptions: Same as `transform`\n\n## reduce\n\n`reduce([opts:Options,] fn: (acc, data[, enc]) =\u003e Promise): ReduceStream`\n\nCreates a new ReduceStream which accepts a function that takes the resolved\ncurrent accumulator, data object, and encoding as arguments and returns the next accumulator\nor a promise for the next accumulator.\n\nThe ReduceStream has a `promise()` method which returns the final\naccumulator value.\n\n```js\nimport split from 'split'\n\nprocess.stdin\n  .pipe(split())\n  .pipe(bstream.reduce((acc, el) =\u003e acc + el, ''))\n  .promise()\n  .then(sum =\u003e {\n    console.log(sum)\n  })\n```\n\n## tap\n\n```ts\ntap(opts?: ITransformStreamOptions | ITapFunction, fn?: ITapFunction) =\u003e TapStream\nnew TapStream(opts?: ITransformStreamOptions | ITapFunction, tapFunction?: ITapFunction)\n```\n\nA passthrough stream that intercepts data and lets you process it. Supports async tap functions which will delay processing. Supports `concurrent` if you need it.\n\n```ts\nimport { pipe, tap, write } from 'bluestream'\nimport { ghoulGenerator, saveGhoul } from './util'\n\nawait pipe(\n  ghoulGenerator(),\n  tap(console.log),\n  write(ghoul =\u003e saveGhoul(ghoul))\n)\n// Ghoul(1)\n// Ghoul(2)\n// Ghoul(3)\n// ... 
👻\n```\n\n## batch\n```ts\nbatch(batchSize: number) =\u003e BatchStream\nnew BatchStream(batchSize: number)\n```\n\nA stream that collects a given number of objects and emits them in an array.\n\n```ts\nimport { batch, pipe, write } from 'bluestream'\nimport { turkeyGenerator } from './util'\n\nawait pipe(\n  turkeyGenerator(),\n  batch(2),\n  write(console.log)\n)\n// [turkey, turkey]\n// [turkey, turkey]\n// [turkey, turkey]\n// [turkey, turkey]\n// ... 🐧🐧\n```\n\n## wait\n\n`wait(stream: Stream): Promise\u003cany\u003e`\n\nWaits for the stream to end. Rejects on errors. If the stream has a `.promise()` method, it will resolve that value, e.g., from [reduce](#reduce).\n\n## pipe\n\n`pipe(readable: Readable, ...writableStreams: Writable[]): Promise\u003cany\u003e;`\n\nPipes readable to writableStreams and forwards all errors to the resulting promise. The promise resolves when the destination stream ends. If the last writableStream has a `.promise()` method, it is resolved. If the last stream is a reduce stream the final value is resolved.\n\nGeneric Pipe example\n```ts\nimport { pipe, read, write } from 'bluestream'\nconst values = [1, 2, 3, null]\nawait pipe(\n  read(() =\u003e values.shift()),\n  write(number =\u003e console.log(number))\n)\n\n```\n\nPipe example with reduce\n```ts\nimport { pipe, read, reduce } from 'bluestream'\nconst values = [1, 2, 3, null]\nconst sum = await pipe(\n  read(() =\u003e values.shift()),\n  reduce((total, value) =\u003e total + value, 0)\n)\nconsole.log(sum)\n// 6\n```\n\n## collect\n\n`collect(stream: Readable): Promise\u003cnull | string | any[] | Buffer\u003e`\n\nReturns a Buffer, string or array of all the data events concatenated together. 
If there are no events, null is returned.\n\n```ts\nimport fs from 'fs'\nimport { collect, read } from 'bluestream'\nawait collect(fs.createReadStream('file'))\n// \u003cBuffer 59 6f 75 20 61 72 65 20 63 6f 6f 6c 21\u003e\nawait collect(fs.createReadStream('file', 'utf8'))\n// 'You are cool!'\nconst values = [1, 2, 3, null]\nawait collect(read(() =\u003e values.shift()))\n// [1, 2, 3]\nawait collect(read(() =\u003e null))\n// null\n```\n\n## readAsync\n\n`readAsync(stream: Readable, count?: number): Promise\u003cany\u003e`\n\nReturns `count` bytes in a Buffer, characters in a string, or objects in an array. If no data arrives before the stream ends, `null` is returned.\n\n## iterate\n\n`iterate(stream: Readable): Readable | AsyncIterableIterator\u003cany\u003e`\n\nReturns an async iterator for any stream on node 8+.\n\n## promise\n\n`promise(stream: Readable) =\u003e Promise\u003cany\u003e`\n\nAll bluestream streams implement a promise method that returns a promise that is fulfilled at the end of the stream or rejected if any errors are emitted by the stream.\n\nFor `ReduceStreams`, the promise is for the final reduction result. 
Any stream errors or exceptions encountered while reducing will result in a rejection of the promise.\n\n```ts\nconst { pipe, map, tap, reduce } = require('bluestream')\nconst { Nodes } = require('./util')\n\nlet count = 0\nconst stats = await pipe(\n  Nodes.scan({ fields: true }),\n  map(generateStats),\n  tap(() =\u003e count++),\n  reduce(mergeGraphStats, {})\n)\nconsole.log({ count, stats })\n```\n\n# Credits\n\nMade by the loving team at [@bustle](https://bustle.com/jobs) and maybe [you](https://github.com/bustle/bluestream/compare)?\n","funding_links":[],"categories":["📦 Legacy \u0026 Inactive Projects"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbustle%2Fbluestream","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fbustle%2Fbluestream","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbustle%2Fbluestream/lists"}