{"id":13526596,"url":"https://github.com/lewisdiamond/stromjs","last_synced_at":"2025-09-12T02:39:05.254Z","repository":{"id":43079718,"uuid":"336427295","full_name":"lewisdiamond/stromjs","owner":"lewisdiamond","description":"Dependency-free stream utils for Node.js","archived":false,"fork":false,"pushed_at":"2023-02-05T15:53:44.000Z","size":505,"stargazers_count":102,"open_issues_count":1,"forks_count":3,"subscribers_count":4,"default_branch":"master","last_synced_at":"2025-08-27T03:37:03.705Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/lewisdiamond.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2021-02-06T01:10:09.000Z","updated_at":"2025-04-01T02:25:48.000Z","dependencies_parsed_at":"2023-02-19T00:00:55.629Z","dependency_job_id":null,"html_url":"https://github.com/lewisdiamond/stromjs","commit_stats":{"total_commits":164,"total_committers":7,"mean_commits":"23.428571428571427","dds":0.5,"last_synced_commit":"bdb187b0ec97363cb275241739e550807edd1385"},"previous_names":[],"tags_count":3,"template":false,"template_full_name":null,"purl":"pkg:github/lewisdiamond/stromjs","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewisdiamond%2Fstromjs","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewisdiamond%2Fstromjs/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewisdiamond%2Fstromjs/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewisdiamond%2Fstromjs/manifests
","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/lewisdiamond","download_url":"https://codeload.github.com/lewisdiamond/stromjs/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewisdiamond%2Fstromjs/sbom","scorecard":{"id":586428,"data":{"date":"2025-08-11","repo":{"name":"github.com/lewisdiamond/stromjs","commit":"bdb187b0ec97363cb275241739e550807edd1385"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":3.2,"checks":[{"name":"Maintained","score":0,"reason":"0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Code-Review","score":0,"reason":"Found 1/29 approved changesets -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/node.js.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least 
privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Pinned-Dependencies","score":3,"reason":"dependency not pinned by hash detected -- score normalized to 3","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/node.js.yml:20: update your workflow using https://app.stepsecurity.io/secureworkflow/lewisdiamond/stromjs/node.js.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/node.js.yml:22: update your workflow using https://app.stepsecurity.io/secureworkflow/lewisdiamond/stromjs/node.js.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/node.js.yml:33: update your workflow using https://app.stepsecurity.io/secureworkflow/lewisdiamond/stromjs/node.js.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/node.js.yml:34: update your workflow using https://app.stepsecurity.io/secureworkflow/lewisdiamond/stromjs/node.js.yml/master?enable=pin","Info:   0 out of   4 GitHub-owned GitHubAction dependencies pinned","Info:   2 out of   2 npmCommand dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF 
(formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"License","score":9,"reason":"license file detected","details":["Info: project has a license file: LICENSE:0","Warn: project license file does not contain an FSF or OSI license."],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release 
artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":-1,"reason":"internal error: error during branchesHandler.setup: internal error: githubv4.Query: Resource not accessible by integration","details":null,"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 4 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}},{"name":"Vulnerabilities","score":3,"reason":"7 existing vulnerabilities detected","details":["Warn: Project is vulnerable to: GHSA-v6h2-p8h4-qcjw","Warn: Project is vulnerable to: GHSA-grv7-fg5c-xmjg","Warn: Project is vulnerable to: GHSA-3xgq-45jj-v275","Warn: Project is vulnerable to: GHSA-4q6p-r6v2-jvc5","Warn: Project is vulnerable to: GHSA-952p-6rrq-rcjv","Warn: Project is vulnerable to: GHSA-9wv6-86v2-598j","Warn: Project is vulnerable to: GHSA-c2qf-rxjj-qqgw"],"documentation":{"short":"Determines if the project has open, known unfixed 
vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}}]},"last_synced_at":"2025-08-20T20:37:58.665Z","repository_id":43079718,"created_at":"2025-08-20T20:37:58.665Z","updated_at":"2025-08-20T20:37:58.665Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":274744062,"owners_count":25341136,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-09-12T02:00:09.324Z","response_time":60,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-01T06:01:31.955Z","updated_at":"2025-09-12T02:39:05.235Z","avatar_url":"https://github.com/lewisdiamond.png","language":"TypeScript","readme":"![image](https://img.shields.io/npm/v/stromjs) ![image](https://img.shields.io/npm/dw/stromjs) ![image](https://github.com/lewisdiamond/stromjs/actions/workflows/node.js.yml/badge.svg)\n\n# Strom\n\n**Dependency-free stream utils for Node.js**\n\n\u003csub\u003eReleased under the [MIT](LICENSE) license.\u003c/sub\u003e\n\n## Installation\n\n```sh\nyarn add stromjs\n```\n```sh\nnpm add stromjs\n```\n\n## Usage\n\n### CommonJS\n```js\nconst strom = require(\"stromjs\")\nprocess.stdin.pipe(strom.map(...))\n```\n```js\nconst {map} = require(\"stromjs\")\nprocess.stdin.pipe(map(...))\n```\n\n### Module\n```js\nimport strom from \"stromjs\"\nprocess.stdin.pipe(strom.map(...))\n```\n```js\nimport {map} 
from \"stromjs\"\nprocess.stdin.pipe(map(...))\n```\n\n### Override default options\n```js\nimport {instance} from \"stromjs\"\nconst strom = instance({objectMode: false})\n```\nor\n```js\nconst strom = require(\"stromjs\").instance({objectMode: false})\n```\n\nSee [instance(defaultOptions)](#instancedefaultOptions) for details.\n\n## API\n\n- [accumulator(flushStrategy, iteratee, options)](#accumulatorflushStrategy-iteratee-options)\n- [batch(batchSize, maxBatchAge, options)](#batchbatchSize-maxBatchAge-options)\n- [child(childProcess)](#childchildProcess)\n- [collect(options)](#collectoptions)\n- [compose(streams, errorCb, options)](#composestreams-errorCb-options)\n- [concat(streams)](#concatstreams)\n- [demux(pipelineConstructor, demuxBy, options)](#demuxpipelineConstructor-demuxBy-options)\n- [duplex(writable, readable)](#duplexwritable-readable)\n- [filter(predicate, options)](#filterpredicate-options)\n- [flatMap(mapper, options)](#flatMapmapper-options)\n- [fromArray(array)](#fromArrayarray)\n- [instance(defaultOptions)](#instancedefaultOptions)\n- [join(separator)](#joinseparator)\n- [last(readable)](#lastreadable)\n- [map(mapper, options)](#mapmapper-options)\n- [merge(streams)](#mergestreams)\n- [parallelMap(mapper, parallel, sleepTime, options)](#parallelMapmapper-parallel-sleepTime-options)\n- [parse()](#parse)\n- [rate()](#rate)\n- [reduce(iteratee, initialValue, options)](#reduceiteratee-initialValue-options)\n- [replace(searchValue, replaceValue)](#replacesearchValue-replaceValue)\n- [split(separator)](#splitseparator)\n- [stringify()](#stringify)\n\n## batch(batchSize, maxBatchAge, options)\nReturns a `Transform` stream which produces all incoming data in batches of size `batchSize`.\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `batchSize` | `number` | Size of the batches to be produced |\n| `maxBatchAge` | `number`  | Maximum number of milliseconds a message will be queued for. E.g. 
a batch will be produced before reaching `batchSize` if the first message queued is `maxBatchAge` ms old or more |\n| `options` | `TransformOptions` | Options passed down to the Transform object |\n\n```js\nstrom.fromArray([\"a\", \"b\", \"c\", \"d\"])\n    .pipe(strom.batch(3, 500))\n    .pipe(process.stdout);\n// [\"a\",\"b\",\"c\"]\n// [\"d\"] //After 500ms\n```\n\n## child(childProcess)\nReturns a `Duplex` stream from a child process' stdin and stdout\n\n| Param | Type | Description |\n| --- | --- | --- |\n| childProcess | `ChildProcess` | Child process from which to create duplex stream |\n\n```js\nconst catProcess = require(\"child_process\").exec(\"grep -o ab\");\nstrom.fromArray([\"a\", \"b\", \"c\"])\n    .pipe(strom.child(catProcess))\n    .pipe(process.stdout);\n// ab is printed out\n```\n\n## collect(options)\nReturns a `ReadWrite` stream that collects streamed chunks into an array or buffer\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `options` | `object`  |  |\n| `options.objectMode` | `boolean` | Whether this stream should behave as a stream of objects |\n\n```js\nstrom.fromArray([\"a\", \"b\", \"c\"])\n    .pipe(strom.collect({ objectMode: true }))\n    .once(\"data\", object =\u003e console.log(object));\n// [ 'a', 'b', 'c' ] is printed out\n```\n\n## compose(streams, errorCb, options)\n\nReturns a `Transform` stream which consists of all `streams` but behaves as a single stream. 
The returned stream can be piped into and from transparently.\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `streams` | `Array` | Streams to be composed |\n| `errorCb` | `(err: Error) =\u003e void`  | Function called when an error occurs in any of the streams |\n| `options` | `TransformOptions` | Options passed down to the Transform object |\n\n```js\nconst composed = strom.compose([\n    strom.split(),\n    strom.map(data =\u003e data.trim()),\n    strom.filter(str =\u003e !!str),\n    strom.parse(),\n    strom.flatMap(data =\u003e data),\n    strom.stringify(),\n]);\n\nconst data = [\"[1,2,3] \\n  [4,5,6] \", \"\\n [7,8,9] \\n\\n\"];\n\nstrom.fromArray(data).pipe(composed).pipe(process.stdout);\n// 123456789\n```\n\n## concat(streams)\nReturns a `Readable` stream of readable streams concatenated together\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `streams` | `...Readable[]` | Readable streams to concatenate |\n\n```js\nconst source1 = new Readable({ read() {} });\nconst source2 = new Readable({ read() {} });\nstrom.concat(source1, source2).pipe(process.stdout);\nsource1.push(\"a1 \");\nsource2.push(\"c3 \");\nsource1.push(\"b2 \");\nsource2.push(\"d4 \");\nsource1.push(null);\nsource2.push(null);\n// a1 b2 c3 d4 is printed out\n```\n\n## duplex(writable, readable)\nReturns a `Duplex` stream from a writable stream that is assumed to somehow, when written to,\ncause the given readable stream to yield chunks\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `writable` | `Writable` | Writable stream assumed to cause the readable stream to yield chunks when written to |\n| `readable` | `Readable` | Readable stream assumed to yield chunks when the writable stream is written to |\n\n```js\nconst catProcess = require(\"child_process\").exec(\"grep -o ab\");\nstrom.fromArray([\"a\", \"b\", \"c\"])\n    .pipe(strom.duplex(catProcess.stdin, catProcess.stdout))\n    .pipe(process.stdout);\n// ab is printed out\n```\n\n## filter(predicate, options)\nReturns a 
`ReadWrite` stream that filters out streamed chunks for which the predicate does not hold\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `predicate` | `(chunk: T, encoding: string) =\u003e boolean` | Predicate with which to filter stream chunks |\n| `options` | `object`  |  |\n| `options.objectMode` | `boolean` | Whether this stream should behave as a stream of objects |\n\n```js\nstrom.fromArray([\"a\", \"b\", \"c\"])\n    .pipe(strom.filter(s =\u003e s !== \"b\"))\n    .pipe(process.stdout);\n// ac is printed out\n```\n\n## flatMap(mapper, options)\nReturns a `ReadWrite` stream that flat maps streamed chunks\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `mapper` | `(chunk: T, encoding: string) =\u003e R[]` | Mapper function, mapping each (chunk, encoding) to an array of new chunks (or a promise of such) |\n| `options` | `object`  |  |\n| `options.readableObjectMode` | `boolean` | Whether this stream should behave as a readable stream of objects |\n| `options.writableObjectMode` | `boolean` | Whether this stream should behave as a writable stream of objects |\n\n```js\nstrom.fromArray([\"a\", \"AA\"])\n    .pipe(strom.flatMap(s =\u003e new Array(s.length).fill(s)))\n    .pipe(process.stdout);\n// aAAAA is printed out\n```\n\n## fromArray(array)\nConvert an array into a `Readable` stream of its elements\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `array` | `T[]` | Array of elements to stream |\n\n```js\nstrom.fromArray([\"a\", \"b\"])\n    .pipe(process.stdout);\n// ab is printed out\n```\n\n## instance(defaultOptions)\nCreates a stromjs instance that uses the provided default options for any created stream function\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `defaultOptions` | `TransformOptions` | Default TransformOptions to apply to created stream functions |\n\n```js\nconst strom = require(\"stromjs\").instance({objectMode: false})\n```\n\n## join(separator)\nReturns a `ReadWrite` stream that joins 
streamed chunks using the given separator\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `separator` | `string` | Separator to join with |\n| `options` | `object` | |\n| `options.encoding` | `string` | Character encoding to use for decoding chunks. Defaults to utf8\n\n```js\nstrom.fromArray([\"a\", \"b\", \"c\"])\n    .pipe(strom.join(\",\"))\n    .pipe(process.stdout);\n// a,b,c is printed out\n```\n\n## last(readable)\nReturns a `Promise` resolving to the last streamed chunk of the given readable stream, after it has\nended\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `readable` | `Readable` | Readable stream to wait on |\n\n```js\nlet f = async () =\u003e {\n    const source = strom.fromArray([\"a\", \"b\", \"c\"]);\n    console.log(await strom.last(source));\n};\nf();\n// c is printed out\n```\n\n## map(mapper, options)\nReturns a `ReadWrite` stream that maps streamed chunks\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `mapper` | `(chunk: T, encoding: string) =\u003e R` | Mapper function, mapping each (chunk, encoding) to a new chunk (or a promise of such) |\n| `options` | `object`  |  |\n| `options.readableObjectMode` | `boolean` | Whether this stream should behave as a readable stream of objects |\n| `options.writableObjectMode` | `boolean` | Whether this stream should behave as a writable stream of objects |\n\n```js\nstrom.fromArray([\"a\", \"b\"])\n    .pipe(strom.map(s =\u003e s.toUpperCase()))\n    .pipe(process.stdout);\n// AB is printed out\n```\n\n## merge(streams)\nReturns a `Readable` stream of readable streams merged together in chunk arrival order\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `streams` | `...Readable[]` | Readable streams to merge |\n\n```js\nconst source1 = new Readable({ read() {} });\nconst source2 = new Readable({ read() {} });\nstrom.merge(source1, source2).pipe(process.stdout);\nsource1.push(\"a1 \");\nsetTimeout(() =\u003e source2.push(\"c3 \"), 10);\nsetTimeout(() =\u003e 
source1.push(\"b2 \"), 20);\nsetTimeout(() =\u003e source2.push(\"d4 \"), 30);\nsetTimeout(() =\u003e source1.push(null), 40);\nsetTimeout(() =\u003e source2.push(null), 50);\n// a1 c3 b2 d4 is printed out\n```\n\n## parallelMap(mapper, parallel, sleepTime, options)\nReturns a `Transform` stream which maps incoming data through the async mapper with the given parallelism.\n\n| Param | Type | Description | Default |\n| --- | --- | --- | --- |\n| `mapper` | `async (chunk: T, encoding: string) =\u003e R` | Mapper function, mapping each (chunk, encoding) to a new chunk (non-async will not be parallelized) | -- |\n| `parallel` | `number`  | Number of concurrent executions of the mapper allowed | 10 |\n| `sleepTime` | `number` | Number of milliseconds to wait before testing if more messages can be processed | 1 |\n\n```js\nfunction sleep(time) {\n    return time \u003e 0 ? new Promise(resolve =\u003e setTimeout(resolve, time)) : null;\n}\n\nstrom\n    .fromArray([1, 2, 3, 4, 6, 8])\n    .pipe(\n        strom.parallelMap(async d =\u003e {\n            await sleep(10000 - d * 1000);\n            return `${d}`;\n        }, 3),\n    )\n    .pipe(process.stdout);\n\n// 321864\n```\n\n## parse()\nReturns a `ReadWrite` stream that parses the streamed chunks as JSON\n\n```js\nstrom.fromArray(['{ \"a\": \"b\" }'])\n    .pipe(strom.parse())\n    .once(\"data\", object =\u003e console.log(object));\n// { a: 'b' } is printed out\n```\n\n## reduce(iteratee, initialValue, options)\nReturns a `ReadWrite` stream that reduces streamed chunks down to a single value and yields that\nvalue\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `iteratee` | `(chunk: T, encoding: string) =\u003e R` | Reducer function to apply on each streamed chunk |\n| `initialValue` | `T` | Initial value |\n| `options` | `object`  |  |\n| `options.readableObjectMode` | `boolean` | Whether this stream should behave as a readable stream of objects |\n| `options.writableObjectMode` | `boolean` | Whether this 
stream should behave as a writable stream of objects |\n\n```js\nstrom.fromArray([\"a\", \"b\", \"cc\"])\n    .pipe(strom.reduce((acc, s) =\u003e ({ ...acc, [s]: s.length }), {}))\n    .pipe(strom.stringify())\n    .pipe(process.stdout);\n// {\"a\":1,\"b\":1,\"cc\":2} is printed out\n```\n\n## replace(searchValue, replaceValue)\nReturns a `ReadWrite` stream that replaces occurrences of the given string or regular expression in\nthe streamed chunks with the specified replacement string\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `searchValue` |  `string \\| RegExp` | Search string to use |\n| `replaceValue` | `string` | Replacement string to use |\n| `options` | `object` | |\n| `options.encoding` | `string` | Character encoding to use for decoding chunks. Defaults to utf8\n\n```js\nstrom.fromArray([\"a1\", \"b22\", \"c333\"])\n    .pipe(strom.replace(/b\\d+/, \"B\"))\n    .pipe(process.stdout);\n// a1Bc333 is printed out\n```\n\n## split(separator)\nReturns a `ReadWrite` stream that splits streamed chunks using the given separator\n\n| Param | Type | Description |\n| --- | --- | --- |\n| `separator` | `string` | Separator to split by, defaulting to `\"\\n\"` |\n| `options` | `object` | |\n| `options.encoding` | `string` | Character encoding to use for decoding chunks. Defaults to utf8\n\n```js\nstrom.fromArray([\"a,b\", \"c,d\"])\n    .pipe(strom.split(\",\"))\n    .pipe(strom.join(\"|\"))\n    .pipe(process.stdout);\n// a|bc|d is printed out\n```\n\n## stringify()\nReturns a `ReadWrite` stream that stringifies the streamed chunks to JSON\n\n```js\nstrom.fromArray([{ a: \"b\" }])\n    .pipe(strom.stringify())\n    .pipe(process.stdout);\n// {\"a\":\"b\"} is printed out\n```\n\n## accumulator(flushStrategy, iteratee, options)\nTO BE DOCUMENTED\n\n\n## demux(pipelineConstructor, demuxBy, options)\nTO BE DOCUMENTED\n\n\n## rate()\nTO BE DOCUMENTED\n\n```js\nconst strom = require(\"stromjs\");\n\nfunction sleep(time) {\n    return time \u003e 0 ? 
new Promise(resolve =\u003e setTimeout(resolve, time)) : null;\n}\n\nconst rate = strom.rate(2, 1, { behavior: 1 });\nrate.pipe(strom.map(x =\u003e console.log(x)));\nasync function produce() {\n    rate.write(1);\n    await sleep(500);\n    rate.write(2);\n    await sleep(500);\n    rate.write(3);\n    rate.write(4);\n    rate.write(5);\n    await sleep(500);\n    rate.write(6);\n}\n\nproduce();\n```\n","funding_links":[],"categories":["Repository","TypeScript","Modules"],"sub_categories":["Streams"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flewisdiamond%2Fstromjs","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Flewisdiamond%2Fstromjs","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flewisdiamond%2Fstromjs/lists"}