https://github.com/willfarrell/datastream
Commonly used stream patterns for Web Streams API and NodeJS Streams
- Host: GitHub
- URL: https://github.com/willfarrell/datastream
- Owner: willfarrell
- License: mit
- Created: 2022-06-20T04:22:26.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2024-09-25T18:45:02.000Z (over 1 year ago)
- Last Synced: 2025-04-06T13:01:36.552Z (12 months ago)
- Topics: nodejs-stream, stream, streams, web-stream
- Language: JavaScript
- Homepage: https://datastream.js.org
- Size: 3.91 MB
- Stars: 7
- Watchers: 3
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: .github/CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE
- Code of conduct: .github/CODE_OF_CONDUCT.md
# datastream
Commonly used stream patterns for Web Streams API and NodeJS Streams.
If you're iterating over an array more than once, it's time to use streams.
- [`@datastream/core`](#core)
- pipeline
- pipejoin
- streamToArray
- streamToString
- isReadable
- isWritable
- makeOptions
- createReadableStream
- createTransformStream
- createWritableStream
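To illustrate how two of these helpers fit together, here is a minimal sketch of `pipejoin` and `streamToArray` built on the standard Web Streams API. These are illustrative re-implementations, not the actual `@datastream/core` source, whose signatures may differ:

```javascript
// Illustrative sketches of two @datastream/core helpers, built on the
// standard Web Streams API. The real implementations may differ.

// pipejoin: chain a readable and a series of transforms into one readable.
const pipejoin = (streams) =>
  streams.reduce((readable, transform) => readable.pipeThrough(transform))

// streamToArray: drain a readable and collect its chunks into an array.
const streamToArray = async (readable) => {
  const chunks = []
  for await (const chunk of readable) {
    chunks.push(chunk)
  }
  return chunks
}

// Usage: double every number flowing through the pipeline.
const readable = new ReadableStream({
  start (controller) {
    [1, 2, 3].forEach((n) => controller.enqueue(n))
    controller.close()
  }
})
const double = new TransformStream({
  transform (chunk, controller) {
    controller.enqueue(chunk * 2)
  }
})
const result = await streamToArray(pipejoin([readable, double]))
// result is [2, 4, 6]
```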
## Streams
- Readable: The start of a pipeline of streams that injects data into a stream.
- PassThrough: Does not modify the data, but observes it as it passes and accumulates a result that can be retrieved afterwards.
- Transform: Modifies data as it passes through.
- Writable: The end of a pipeline of streams that stores data from the stream.
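The Web Streams API has no built-in PassThrough class; in this library's pattern it can be modeled as a `TransformStream` that forwards chunks unchanged while accumulating a result on the side. A minimal sketch (not the library's actual implementation):

```javascript
// A PassThrough-style stream: forwards chunks untouched while
// accumulating a result (here, total string length) on the side.
const createLengthStream = () => {
  const result = { length: 0 }
  const stream = new TransformStream({
    transform (chunk, controller) {
      result.length += chunk.length
      controller.enqueue(chunk) // pass the chunk through unmodified
    }
  })
  return { stream, result }
}

const { stream, result } = createLengthStream()
const readable = new ReadableStream({
  start (controller) {
    controller.enqueue('hello ')
    controller.enqueue('world')
    controller.close()
  }
})
const chunks = []
for await (const chunk of readable.pipeThrough(stream)) {
  chunks.push(chunk)
}
// chunks is ['hello ', 'world']; result.length is 11
```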
### Basics
- [`@datastream/string`](#string)
- stringReadableStream [Readable]
- stringLengthStream [PassThrough]
- stringOutputStream [PassThrough]
- [`@datastream/object`](#object)
- objectReadableStream [Readable]
- objectCountStream [PassThrough]
- objectBatchStream [Transform]
- objectOutputStream [PassThrough]
### Common
- [`@datastream/fetch`](#fetch)
- fetchResponseStream [Readable]
- [`@datastream/charset[/{detect,decode,encode}]`](#charset)
- charsetDetectStream [PassThrough]
- charsetDecodeStream [Transform]
- charsetEncodeStream [Transform]
- [`@datastream/compression[/{gzip,deflate}]`](#compression)
- gzipCompressionStream [Transform]
- gzipDecompressionStream [Transform]
- deflateCompressionStream [Transform]
- deflateDecompressionStream [Transform]
- [`@datastream/digest`](#digest)
- digestStream [PassThrough]
### Advanced
- [`@datastream/csv[/{parse,format}]`](#csv)
- csvParseStream [Transform]
- csvFormatStream [Transform]
- [`@datastream/validate`](#validate)
- validateStream [Transform]
## Setup
```bash
npm install @datastream/core @datastream/{module}
```
## Flows
```mermaid
stateDiagram-v2
[*] --> fileRead*: path
[*] --> fetchResponse: URL
[*] --> sqlCopyTo*: SQL
[*] --> stringReadable: string
[*] --> stringReadable: string[]
[*] --> objectReadable: object[]
[*] --> createReadable: blob
readable --> charsetDetect: binary
charsetDetect --> [*]
readable --> decryption
decryption --> passThroughBuffer: buffer
readable --> decompression
decompression --> passThroughBuffer: buffer
passThroughBuffer --> charsetDecode: buffer
charsetDecode --> passThroughString: string
passThroughString --> parse: string
parse --> validate: object
validate --> passThroughObject: object
passThroughObject --> transform: object
transform --> format: object
format --> charsetEncode: string
charsetEncode --> compression: buffer
compression --> writable: buffer
charsetEncode --> encryption: buffer
encryption --> writable: buffer
state readable {
fileRead*
fetchResponse
sqlCopyTo*
createReadable
stringReadable
objectReadable
awsS3Get
awsDynamoDBQuery
awsDynamoDBScan
awsDynamoDBGet
}
state decompression {
brotliDecompression
gzipDecompression
deflateDecompression
zstdDecompression*
protobufDecompression*
}
state decryption {
decryption*
}
state parse {
csvParse
jsonParse*
xmlParse*
}
state passThroughBuffer {
digest
}
state passThroughString {
stringLength
stringOutput
}
state passThroughObject {
objectCount
objectOutput
}
state transform {
objectBatch
objectPivotLongToWide
objectPivotWideToLong
objectKeyValue
}
state format {
csvFormat
jsonFormat*
xmlFormat*
}
state compression {
brotliCompression
gzipCompression
deflateCompression
zstdCompression*
protobufCompression*
}
state encryption {
encryption*
}
state writable {
fileWrite*
fetchRequest*
sqlCopyFrom*
awsS3Put
awsDynamoDBPut
awsDynamoDBDelete
}
writable --> [*]
```
\* possible future package
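One concrete path through the diagram can be written with only the Web Streams built-ins that ship in modern runtimes (`TextEncoderStream`, `CompressionStream`, Node.js 18+); the `@datastream` packages wrap patterns like this one:

```javascript
// string readable -> charsetEncode -> compression -> writable,
// using only Web Streams built-ins (Node.js 18+ or modern browsers).
const readable = new ReadableStream({
  start (controller) {
    controller.enqueue('hello ')
    controller.enqueue('world')
    controller.close()
  }
})

const compressed = []
await readable
  .pipeThrough(new TextEncoderStream())        // string -> Uint8Array
  .pipeThrough(new CompressionStream('gzip'))  // Uint8Array -> gzip bytes
  .pipeTo(new WritableStream({
    write (chunk) { compressed.push(chunk) }   // writable end of the pipeline
  }))
```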
## Write your own
### Readable
#### NodeJS Streams
- [NodeJS](https://nodejs.org/api/stream.html#class-streamreadable)
#### Web Streams API
- [MDN](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream)
- [NodeJS](https://nodejs.org/api/webstreams.html#class-readablestream)
### Transform
#### NodeJS Streams
- [NodeJS](https://nodejs.org/api/stream.html#class-streamtransform)
#### Web Streams API
- [MDN](https://developer.mozilla.org/en-US/docs/Web/API/TransformStream)
- [NodeJS](https://nodejs.org/api/webstreams.html#class-transformstream)
### Writable
#### NodeJS Streams
- [NodeJS](https://nodejs.org/api/stream.html#class-streamwritable)
#### Web Streams API
- [MDN](https://developer.mozilla.org/en-US/docs/Web/API/WritableStream)
- [NodeJS](https://nodejs.org/api/webstreams.html#class-writablestream)
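For example, a custom Transform written against the Web Streams API is just a `TransformStream` with a `transform` callback (a minimal sketch, not part of this library):

```javascript
// A custom Transform with the Web Streams API: uppercases each
// string chunk as it passes through.
const uppercaseStream = new TransformStream({
  transform (chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  }
})

const readable = new ReadableStream({
  start (controller) {
    controller.enqueue('data')
    controller.enqueue('stream')
    controller.close()
  }
})

const chunks = []
for await (const chunk of readable.pipeThrough(uppercaseStream)) {
  chunks.push(chunk)
}
// chunks is ['DATA', 'STREAM']
```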
## End-to-End Examples
### NodeJS: Import CSV into SQL database
Read a CSV file, validate the structure, pivot data, then save compressed.
- fs.createReadStream
- gzip
- cryptoDigest
- charsetDecode
- csvParse
- countChunks
- validate
- changeCase (pascal to snake)
- parquet?
- csvFormat
- postgresCopyFrom
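The first stages of that pipeline can be sketched with Node built-ins alone. The naive line split below is a stand-in for a real CSV parser (which must handle quoting and chunk boundaries), and the temporary file is created purely for the example:

```javascript
// Sketch of the pipeline's first stages using only Node built-ins:
// read a gzipped CSV file, decompress, decode to text, split into rows.
import { Readable } from 'node:stream'
import { createReadStream, writeFileSync } from 'node:fs'
import { gzipSync } from 'node:zlib'
import { tmpdir } from 'node:os'
import { join } from 'node:path'

// Prepare a sample gzipped CSV file for the sketch.
const path = join(tmpdir(), 'datastream-example.csv.gz')
writeFileSync(path, gzipSync('id,name\n1,Ada\n2,Alan\n'))

let text = ''
await Readable.toWeb(createReadStream(path))     // fs.createReadStream
  .pipeThrough(new DecompressionStream('gzip'))  // gunzip
  .pipeThrough(new TextDecoderStream())          // charsetDecode
  .pipeTo(new WritableStream({
    write (chunk) { text += chunk }              // buffer the decoded text
  }))

// Naive stand-in for csvParse: split rows and columns.
const rows = text.trim().split('\n').map((line) => line.split(','))
// rows is [['id', 'name'], ['1', 'Ada'], ['2', 'Alan']]
```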
### WebWorker: Validate and collect metadata about file prior to upload
- cryptoDigest
- charsetDetect
- jsonParse?
- validate
### WebWorker: Upload file compressed
Upload file with brotli compression?
### WebWorker: Decompress protobuf compressed JSON requests
Fetch protobuf file, decompress, parse JSON
### streams
- filter
- file (docs only?)
### examples
- fetch
- node:fs
- input type=file
- readable string/array/etc
## License
Licensed under [MIT License](LICENSE). Copyright (c) 2026 [Will Farrell](https://github.com/willfarrell) and [contributors](https://github.com/middyjs/middy/graphs/contributors).