Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/planet-a-ventures/avsc-zstandard-codec
Zstandard codec for AVSC (www.npmjs.com/package/avsc)
- Host: GitHub
- URL: https://github.com/planet-a-ventures/avsc-zstandard-codec
- Owner: planet-a-ventures
- License: MIT
- Created: 2024-09-16T09:48:15.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2024-10-21T11:12:49.000Z (about 1 month ago)
- Last Synced: 2024-10-21T12:15:40.814Z (about 1 month ago)
- Topics: avcs, avro, codec, compression, node, npm, snowflake, zstandard, zstd
- Language: TypeScript
- Homepage:
- Size: 87.9 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# `Zstandard` codec for `avsc`
[Zstandard](https://github.com/facebook/zstd) codec for [avsc](https://github.com/mtth/avsc).
## How to install
```shell
npm i @planet-a/avsc-zstandard-codec
```
or
```shell
yarn add @planet-a/avsc-zstandard-codec
```
## Example
```ts
import Avro from "avsc";
import { finished } from "node:stream/promises";
import {
  createDecoderMixin,
  createEncoderMixin,
  codecName,
} from "@planet-a/avsc-zstandard-codec";

const mySchema = Avro.Type.forSchema({ type: "string" });

{
  // encode
  const fileEncoder = Avro.createFileEncoder("./my.avro", mySchema, {
    codec: codecName,
    codecs: {
      ...Avro.streams.BlockEncoder.defaultCodecs(),
      ...createEncoderMixin(),
    },
  });
  fileEncoder.write("Hello");
  fileEncoder.write("World");
  fileEncoder.end();
  await finished(fileEncoder);
}

{
  // decode
  const fileDecoder = Avro.createFileDecoder("./my.avro", {
    codecs: {
      ...Avro.streams.BlockDecoder.defaultCodecs(),
      ...createDecoderMixin(),
    },
  }).on("data", console.log.bind(console));
  await finished(fileDecoder);
}
```

## Why `@mongodb-js/zstd`?
It uses the [@mongodb-js/zstd](https://github.com/mongodb-js/zstd) package, which has a few advantages:
- Its `decompress` function does not need the uncompressed buffer size in advance; most other WASM-based implementations have this restriction, which renders them unusable for this task
- It works with `Buffer`. While a `Uint8Array`-based implementation would be more portable (I am looking at you, Deno), `[email protected]` itself uses `Buffer`. https://github.com/mtth/avsc/pull/452 has landed, so we may have more package options once we drop `[email protected]` support.

## A note about `Snowflake` compatibility
You'll see that the current implementation uses defaults from the [Avro repository](https://github.com/apache/avro).
Namely:
- the codec name (if you don't adhere to `zstandard`, the file won't be readable at all)
- whether to use a checksum (with a checksum, the metadata will still be readable, but reading the data will yield an error: `Could not read file`)

The reason is that, to make the Avro export as portable as possible, none of these settings should need to be specified by the consumer. A prime example is Snowflake's Avro support ([`COPY INTO`](https://docs.snowflake.com/en/sql-reference/sql/copy-into-table)): if you alter the codec name and/or the checksum flag, you won't be able to ingest the generated Avro files with their product.
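For illustration, a file produced with this codec's defaults can be ingested with a plain `COPY INTO` statement; this is a sketch, and the table and stage names are hypothetical:

```sql
-- Hypothetical table and stage; my.avro was written with the defaults above,
-- so no codec-specific options are needed on the Snowflake side.
COPY INTO my_table
  FROM @my_stage/my.avro
  FILE_FORMAT = (TYPE = 'AVRO');
```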