Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/nitedani/rxjs-stream
- Host: GitHub
- URL: https://github.com/nitedani/rxjs-stream
- Owner: nitedani
- Created: 2021-01-24T15:49:23.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2021-01-25T02:00:19.000Z (almost 4 years ago)
- Last Synced: 2024-10-02T08:55:53.243Z (about 1 month ago)
- Language: JavaScript
- Size: 20.5 KB
- Stars: 4
- Watchers: 1
- Forks: 1
- Open Issues: 1
Metadata Files:
- Readme: README.md
README
Currently this package exposes a single function:
`fromReadStream(readStream: stream.Readable): Observable`
`fromReadStream` creates an RxJS Observable from a Node.js readable stream, with backpressure support.
How it works: the underlying read stream is paused whenever a new chunk arrives, and resumed only after the value has been emitted to and processed by the observer.
Example:
```js
import { createReadStream } from "fs";
import { fromReadStream } from "@nitedani/rxjs-stream";
import { delay, tap, map } from "rxjs/operators";

const rs = createReadStream("huge_file.txt", {
  encoding: "utf-8",
});

fromReadStream(rs)
  .pipe(
    tap(() => {
      console.log("Processing chunk...");
    }),
    map((chunk) => chunk.toLowerCase()),
    delay(1000)
  )
  .subscribe((chunk) => console.log(chunk));
```
Processing a large JSON array with [stream-json](https://www.npmjs.com/package/stream-json):
```js
import { createReadStream } from "fs";
import { fromReadStream } from "@nitedani/rxjs-stream";
import { delay, tap } from "rxjs/operators";
import { withParser } from "stream-json/streamers/StreamArray";

const rs = createReadStream("huge_array.json", {
  encoding: "utf-8",
});
const jsonStream = rs.pipe(withParser());

fromReadStream(jsonStream)
  .pipe(
    tap(() => {
      console.log("Processing chunk...");
    }),
    delay(1000)
  )
  .subscribe({
    next: (chunk) => console.log(chunk),
    complete: () => {
      console.log("complete");
    },
  });
```
Processing a large CSV file with [fast-csv](https://www.npmjs.com/package/fast-csv):
```js
import { createReadStream } from "fs";
import { fromReadStream } from "@nitedani/rxjs-stream";
import { delay, tap } from "rxjs/operators";
import { parse } from "fast-csv";

const rs = createReadStream("huge_csv.csv");
const csvStream = rs.pipe(parse({ headers: true }));

fromReadStream(csvStream)
  .pipe(
    tap(() => {
      console.log("Processing chunk...");
    }),
    delay(1000)
  )
  .subscribe({
    next: (chunk) => console.log(chunk),
    complete: () => {
      console.log("complete");
    },
  });
```