https://github.com/willfarrell/datastream

<datastream>


Commonly used stream patterns for the Web Streams API and NodeJS Streams.


If you're iterating over an array more than once, it's time to use streams.

- [`@datastream/core`](#core)
  - pipeline
  - pipejoin
  - streamToArray
  - streamToString
  - isReadable
  - isWritable
  - makeOptions
  - createReadableStream
  - createTransformStream
  - createWritableStream
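
The core helpers are not documented in this section, so as a rough sketch, here is how the names above might map onto the native Web Streams API (Node >= 18). These signatures are assumptions for illustration, not the package's actual API:

```javascript
// Hypothetical sketches of the core helpers, built on native Web
// Streams (Node >= 18). The real @datastream/core API may differ.

// createReadableStream: turn an iterable into a ReadableStream
const createReadableStream = (input) =>
  new ReadableStream({
    start (controller) {
      for (const chunk of input) controller.enqueue(chunk)
      controller.close()
    }
  })

// pipejoin: connect a readable through a chain of transform streams
const pipejoin = ([readable, ...transforms]) =>
  transforms.reduce((stream, transform) => stream.pipeThrough(transform), readable)

// streamToArray: drain a stream and collect its chunks
const streamToArray = async (stream) => {
  const chunks = []
  for await (const chunk of stream) chunks.push(chunk)
  return chunks
}

// streamToString: drain a stream of strings into one string
const streamToString = async (stream) => {
  let result = ''
  for await (const chunk of stream) result += chunk
  return result
}
```

For example, `streamToArray(pipejoin([createReadableStream([1, 2, 3])]))` resolves to `[1, 2, 3]`.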

## Streams

- Readable: The start of a pipeline of streams that injects data into a stream.
- PassThrough: Does not modify the data, but listens to the data and prepares a result that can be retrieved.
- Transform: Modifies data as it passes through.
- Writable: The end of a pipeline of streams that stores data from the stream.
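
With the native Web Streams API, the four roles can be demonstrated in a few lines (a sketch using built-in classes; the package's streams wrap these same primitives):

```javascript
// Readable: injects data into the pipeline
const readable = new ReadableStream({
  start (controller) {
    for (const word of ['a', 'b', 'c']) controller.enqueue(word)
    controller.close()
  }
})

// PassThrough: observes chunks without modifying them
let count = 0
const passThrough = new TransformStream({
  transform (chunk, controller) {
    count += 1                // side effect: prepare a result
    controller.enqueue(chunk) // pass the data along unchanged
  }
})

// Transform: modifies chunks in flight
const transform = new TransformStream({
  transform (chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  }
})

// Writable: terminates the pipeline and stores the data
const sink = []
const writable = new WritableStream({
  write (chunk) { sink.push(chunk) }
})

const done = readable.pipeThrough(passThrough).pipeThrough(transform).pipeTo(writable)
// once `done` resolves: count === 3, sink === ['A', 'B', 'C']
```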

### Basics

- [`@datastream/string`](#string)
  - stringReadableStream [Readable]
  - stringLengthStream [PassThrough]
  - stringOutputStream [PassThrough]
- [`@datastream/object`](#object)
  - objectReadableStream [Readable]
  - objectCountStream [PassThrough]
  - objectBatchStream [Transform]
  - objectOutputStream [PassThrough]

### Common

- [`@datastream/fetch`](#fetch)
  - fetchResponseStream [Readable]
- [`@datastream/charset[/{detect,decode,encode}]`](#charset)
  - charsetDetectStream [PassThrough]
  - charsetDecodeStream [Transform]
  - charsetEncodeStream [Transform]
- [`@datastream/compression[/{gzip,deflate}]`](#compression)
  - gzipCompressionStream [Transform]
  - gzipDecompressionStream [Transform]
  - deflateCompressionStream [Transform]
  - deflateDecompressionStream [Transform]
- [`@datastream/digest`](#digest)
  - digestStream [PassThrough]

### Advanced

- [`@datastream/csv[/{parse,format}]`](#csv)
  - csvParseStream [Transform]
  - csvFormatStream [Transform]
- [`@datastream/validate`](#validate)
  - validateStream [Transform]

## Setup

```bash
npm install @datastream/core @datastream/{module}
```

## Flows

```mermaid
stateDiagram-v2

[*] --> fileRead*: path
[*] --> fetchResponse: URL
[*] --> sqlCopyTo*: SQL
[*] --> stringReadable: string
[*] --> stringReadable: string[]
[*] --> objectReadable: object[]
[*] --> createReadable: blob

readable --> charsetDetect: binary
charsetDetect --> [*]

readable --> decryption
decryption --> passThroughBuffer: buffer

readable --> decompression
decompression --> passThroughBuffer: buffer
passThroughBuffer --> charsetDecode: buffer
charsetDecode --> passThroughString: string
passThroughString --> parse: string
parse --> validate: object
validate --> passThroughObject: object
passThroughObject --> transform: object

transform --> format: object
format --> charsetEncode: string
charsetEncode --> compression: buffer
compression --> writable: buffer

charsetEncode --> encryption: buffer
encryption --> writable: buffer

state readable {
fileRead*
fetchResponse
sqlCopyTo*
createReadable
stringReadable
objectReadable
awsS3Get
awsDynamoDBQuery
awsDynamoDBScan
awsDynamoDBGet
}

state decompression {
brotliDecompression
gzipDecompression
deflateDecompression
zstdDecompression*
protobufDecompression*
}

state decryption {
decryption*
}

state parse {
csvParse
jsonParse*
xmlParse*
}

state passThroughBuffer {
digest
}

state passThroughString {
stringLength
stringOutput
}

state passThroughObject {
objectCount
objectOutput
}

state transform {
objectBatch
objectPivotLongToWide
objectPivotWideToLong
objectKeyValue
}

state format {
csvFormat
jsonFormat*
xmlFormat*
}

state compression {
brotliCompression
gzipCompression
deflateCompression
zstdCompression*
protobufCompression*
}

state encryption {
encryption*
}

state writable {
fileWrite*
fetchRequest*
sqlCopyFrom*
awsS3Put
awsDynamoDBPut
awsDynamoDBDelete
}
writable --> [*]
```

\* possible future package
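
A small slice of this flow can be run with nothing but built-in web streams (Node >= 18). The comments map each step back to a state in the diagram; the actual packages wrap these primitives with extra options:

```javascript
// bytes -> gzip decompression -> charset decode -> parse,
// using only built-in web stream classes.

// Stand-in for a compressed source (e.g. a fetch response body):
// gzip some JSON so the pipeline has bytes to consume.
const source = new Blob(['{"ok":true}'])
  .stream()
  .pipeThrough(new CompressionStream('gzip'))

const decoded = source
  .pipeThrough(new DecompressionStream('gzip')) // gzipDecompression
  .pipeThrough(new TextDecoderStream('utf-8'))  // charsetDecode

let text = ''
const done = decoded.pipeTo(new WritableStream({
  write (chunk) { text += chunk }               // stringOutput
}))
// once `done` resolves: JSON.parse(text) -> { ok: true }   (jsonParse)
```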

## Write your own

### Readable

#### NodeJS Streams

- [NodeJS](https://nodejs.org/api/stream.html#class-streamreadable)

#### Web Streams API

- [MDN](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream)
- [NodeJS](https://nodejs.org/api/webstreams.html#class-readablestream)
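
As a minimal example of the Web Streams flavour, here is a readable that produces values on demand (the counter source is illustrative):

```javascript
// A readable that produces data only when the consumer asks for it.
// pull() is called whenever the internal queue has room, so
// backpressure comes for free.
function makeCounterStream (limit) {
  let n = 0
  return new ReadableStream({
    pull (controller) {
      if (n < limit) controller.enqueue(n++)
      else controller.close()
    }
  })
}
```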

### Transform

#### NodeJS Streams

- [NodeJS](https://nodejs.org/api/stream.html#class-streamtransform)

#### Web Streams API

- [MDN](https://developer.mozilla.org/en-US/docs/Web/API/TransformStream)
- [NodeJS](https://nodejs.org/api/webstreams.html#class-transformstream)
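
A typical Web Streams transform, shown here splitting text into lines while buffering partial lines across chunk boundaries (an illustrative helper, not part of the package):

```javascript
// Splits incoming string chunks into individual lines.
const createLineSplitStream = () => {
  let buffer = ''
  return new TransformStream({
    transform (chunk, controller) {
      buffer += chunk
      const lines = buffer.split('\n')
      buffer = lines.pop() // keep the trailing partial line
      for (const line of lines) controller.enqueue(line)
    },
    flush (controller) {
      // emit whatever is left when the upstream closes
      if (buffer !== '') controller.enqueue(buffer)
    }
  })
}
```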

### Writable

#### NodeJS Streams

- [NodeJS](https://nodejs.org/api/stream.html#class-streamwritable)

#### Web Streams API

- [MDN](https://developer.mozilla.org/en-US/docs/Web/API/WritableStream)
- [NodeJS](https://nodejs.org/api/webstreams.html#class-writablestream)
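
And a minimal Web Streams writable that stores everything it receives; returning a result object alongside the stream is one common way to retrieve data after the pipeline settles (illustrative, not the package's pattern):

```javascript
// A writable sink that accumulates chunks; the caller keeps a
// reference to `result` and reads it once the pipeline finishes.
const createArraySinkStream = () => {
  const result = []
  const stream = new WritableStream({
    write (chunk) {
      result.push(chunk)
    }
  })
  return { stream, result }
}
```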

## End-to-End Examples

### NodeJS: Import CSV into SQL database

Read a CSV file, validate the structure, pivot data, then save compressed.

- fs.createReadStream
- gzip
- cryptoDigest
- charsetDecode
- csvParse
- countChunks
- validate
- changeCase (pascal to snake)
- parquet?
- csvFormat
- postgresCopyFrom
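
The steps above can be sketched with Node's built-in streams. The package names are stand-ins here: the CSV parse is a naive comma split for illustration, and an in-memory sink replaces the SQL copy step:

```javascript
import { pipeline } from 'node:stream/promises'
import { Readable, Transform, Writable } from 'node:stream'
import { createGunzip, gzipSync } from 'node:zlib'

// Stand-in for fs.createReadStream(path) over a gzipped CSV fixture
const source = Readable.from([gzipSync('id,name\n1,Ada\n2,Grace\n')])

const rows = []
const done = pipeline(
  source,
  createGunzip(), // gzip -> raw bytes
  new Transform({ // charsetDecode + a naive csvParse
    objectMode: true,
    transform (chunk, _enc, cb) {
      for (const line of chunk.toString('utf-8').trim().split('\n')) {
        this.push(line.split(','))
      }
      cb()
    }
  }),
  new Writable({ // in-memory stand-in for postgresCopyFrom
    objectMode: true,
    write (row, _enc, cb) {
      rows.push(row)
      cb()
    }
  })
)
// once `done` resolves, rows holds [['id','name'], ['1','Ada'], ['2','Grace']]
```

A real csvParse must buffer partial lines across chunk boundaries; this sketch assumes the small fixture arrives in a single chunk.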

### WebWorker: Validate and collect metadata about file prior to upload

-
- cryptoDigest
- charsetDetect
- jsonParse?
- validate

### WebWorker: Upload file compressed

Upload file with brotli compression?

### WebWorker: Decompress protobuf compressed JSON requests

Fetch protobuf file, decompress, parse JSON

### streams

- filter

- file (docs only?)

### examples

- fetch
- node:fs
- input type=file
- readable string/array/etc

## License

Licensed under [MIT License](LICENSE). Copyright (c) 2026 [Will Farrell](https://github.com/willfarrell) and [contributors](https://github.com/middyjs/middy/graphs/contributors).