{"id":26126201,"url":"https://github.com/discoveryjs/json-ext","last_synced_at":"2025-05-14T15:05:50.882Z","repository":{"id":38343846,"uuid":"287218834","full_name":"discoveryjs/json-ext","owner":"discoveryjs","description":"A set of performant and memory efficient utilities that extend the use of JSON","archived":false,"fork":false,"pushed_at":"2025-02-12T22:21:15.000Z","size":11556,"stargazers_count":166,"open_issues_count":2,"forks_count":6,"subscribers_count":4,"default_branch":"master","last_synced_at":"2025-05-09T03:31:21.840Z","etag":null,"topics":["async","json","parse","parser","stream","stringify","stringifystream"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/discoveryjs.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-08-13T08:04:56.000Z","updated_at":"2025-04-24T12:29:12.000Z","dependencies_parsed_at":"2024-06-17T18:29:07.444Z","dependency_job_id":"6d4deb02-1218-4da7-8c67-11fe6cb18982","html_url":"https://github.com/discoveryjs/json-ext","commit_stats":{"total_commits":209,"total_committers":7,"mean_commits":"29.857142857142858","dds":"0.038277511961722466","last_synced_commit":"570860b81f214321eb1860de00db9866510173f0"},"previous_names":[],"tags_count":19,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/discoveryjs%2Fjson-ext","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/discoveryjs%2Fjson-ext/tags","releases_url":"https://repos.ecosyste.ms/ap
i/v1/hosts/GitHub/repositories/discoveryjs%2Fjson-ext/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/discoveryjs%2Fjson-ext/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/discoveryjs","download_url":"https://codeload.github.com/discoveryjs/json-ext/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253948511,"owners_count":21988962,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["async","json","parse","parser","stream","stringify","stringifystream"],"created_at":"2025-03-10T17:28:03.556Z","updated_at":"2025-05-14T15:05:50.853Z","avatar_url":"https://github.com/discoveryjs.png","language":"JavaScript","readme":"# json-ext\n\n[![NPM version](https://img.shields.io/npm/v/@discoveryjs/json-ext.svg)](https://www.npmjs.com/package/@discoveryjs/json-ext)\n[![Build Status](https://github.com/discoveryjs/json-ext/actions/workflows/ci.yml/badge.svg)](https://github.com/discoveryjs/json-ext/actions/workflows/ci.yml)\n[![Coverage Status](https://coveralls.io/repos/github/discoveryjs/json-ext/badge.svg?branch=master)](https://coveralls.io/github/discoveryjs/json-ext)\n[![NPM Downloads](https://img.shields.io/npm/dm/@discoveryjs/json-ext.svg)](https://www.npmjs.com/package/@discoveryjs/json-ext)\n\nA set of utilities designed to extend JSON's capabilities, especially for handling large JSON data (over 100MB) efficiently:\n\n- [parseChunked()](#parsechunked) – Parses JSON incrementally; similar to 
[`JSON.parse()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse), but processing JSON data in chunks.\n- [stringifyChunked()](#stringifychunked) – Converts JavaScript objects to JSON incrementally; similar to [`JSON.stringify()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify), but returns a generator that yields JSON strings in parts.\n- [stringifyInfo()](#stringifyinfo) – Estimates the size of the `JSON.stringify()` result and identifies circular references without generating the JSON.\n- [parseFromWebStream()](#parsefromwebstream) – A helper function to parse JSON chunks directly from a Web Stream.\n- [createStringifyWebStream()](#createstringifywebstream) – A helper function to generate JSON data as a Web Stream.\n\n### Key Features\n\n- Optimized to handle large JSON data with minimal resource usage (see [benchmarks](./benchmarks/README.md))\n- Works seamlessly with browsers, Node.js, Deno, and Bun\n- Supports both Node.js and Web streams\n- Available in both ESM and CommonJS\n- TypeScript typings included\n- No external dependencies\n- Compact size: 9.4Kb (minified), 3.8Kb (min+gzip)\n\n### Why json-ext?\n\n- **Handles large JSON files**: Overcomes the limitations of V8 for strings larger than ~500MB, enabling the processing of huge JSON data.\n- **Prevents main thread blocking**: Distributes parsing and stringifying over time, ensuring the main thread remains responsive during heavy JSON operations.\n- **Reduces memory usage**: Traditional `JSON.parse()` and `JSON.stringify()` require loading entire data into memory, leading to high memory consumption and increased garbage collection pressure. 
`parseChunked()` and `stringifyChunked()` process data incrementally, optimizing memory usage.\n- **Size estimation**: `stringifyInfo()` allows estimating the size of resulting JSON before generating it, enabling better decision-making for JSON generation strategies.\n\n## Install\n\n```bash\nnpm install @discoveryjs/json-ext\n```\n\n## API\n\n### parseChunked()\n\nFunctions like [`JSON.parse()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse), iterating over chunks to reconstruct the result object, and returns a [Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise).\n\n\u003e Note: `reviver` parameter is not supported yet.\n\n```ts\nfunction parseChunked(input: Iterable\u003cChunk\u003e | AsyncIterable\u003cChunk\u003e): Promise\u003cany\u003e;\nfunction parseChunked(input: () =\u003e (Iterable\u003cChunk\u003e | AsyncIterable\u003cChunk\u003e)): Promise\u003cany\u003e;\n\ntype Chunk = string | Buffer | Uint8Array;\n```\n\n[Benchmark](https://github.com/discoveryjs/json-ext/tree/master/benchmarks#parse-chunked)\n\nUsage:\n\n```js\nimport { parseChunked } from '@discoveryjs/json-ext';\n\nconst data = await parseChunked(chunkEmitter);\n```\n\nParameter `chunkEmitter` can be an iterable or async iterable that iterates over chunks, or a function returning such a value. 
A chunk can be a `string`, `Uint8Array`, or Node.js `Buffer`.\n\nExamples:\n\n- Generator:\n    ```js\n    parseChunked(function*() {\n        yield '{ \"hello\":';\n        yield Buffer.from(' \"wor'); // Node.js only\n        yield new TextEncoder().encode('ld\" }'); // returns Uint8Array\n    });\n    ```\n- Async generator:\n    ```js\n    parseChunked(async function*() {\n        for await (const chunk of someAsyncSource) {\n            yield chunk;\n        }\n    });\n    ```\n- Array:\n    ```js\n    parseChunked(['{ \"hello\":', ' \"world\"}'])\n    ```\n- Function returning iterable:\n    ```js\n    parseChunked(() =\u003e ['{ \"hello\":', ' \"world\"}'])\n    ```\n- Node.js [`Readable`](https://nodejs.org/dist/latest-v14.x/docs/api/stream.html#stream_readable_streams) stream:\n    ```js\n    import fs from 'node:fs';\n\n    parseChunked(fs.createReadStream('path/to/file.json'))\n    ```\n- Web stream (e.g., using [fetch()](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API)):\n    \u003e Note: Iterability for Web streams was added later in the Web platform, so not all environments support it. 
Consider using `parseFromWebStream()` for broader compatibility.\n    ```js\n    const response = await fetch('https://example.com/data.json');\n    const data = await parseChunked(response.body); // body is ReadableStream\n    ```\n\n### stringifyChunked()\n\nFunctions like [`JSON.stringify()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify), but returns a generator yielding strings instead of a single string.\n\n\u003e Note: Returns `\"null\"` when `JSON.stringify()` returns `undefined` (since a chunk cannot be `undefined`).\n\n```ts\nfunction stringifyChunked(value: any, replacer?: Replacer, space?: Space): Generator\u003cstring, void, unknown\u003e;\nfunction stringifyChunked(value: any, options: StringifyOptions): Generator\u003cstring, void, unknown\u003e;\n\ntype Replacer =\n    | ((this: any, key: string, value: any) =\u003e any)\n    | (string | number)[]\n    | null;\ntype Space = string | number | null;\ntype StringifyOptions = {\n    replacer?: Replacer;\n    space?: Space;\n    highWaterMark?: number;\n};\n```\n\n[Benchmark](https://github.com/discoveryjs/json-ext/tree/master/benchmarks#stream-stringifying)\n\nUsage:\n\n- Getting an array of chunks:\n    ```js\n    const chunks = [...stringifyChunked(data)];\n    ```\n- Iterating over chunks:\n    ```js\n    for (const chunk of stringifyChunked(data)) {\n        console.log(chunk);\n    }\n    ```\n- Specifying the minimum size of a chunk with `highWaterMark` option:\n    ```js\n    const data = [1, \"hello world\", 42];\n\n    console.log([...stringifyChunked(data)]); // default 16kB\n    // ['[1,\"hello world\",42]']\n\n    console.log([...stringifyChunked(data, { highWaterMark: 16 })]);\n    // ['[1,\"hello world\"', ',42]']\n\n    console.log([...stringifyChunked(data, { highWaterMark: 1 })]);\n    // ['[1', ',\"hello world\"', ',42', ']']\n    ```\n- Streaming into a stream with a `Promise` (modern Node.js):\n    ```js\n    import { pipeline } from 
'node:stream/promises';\n    import fs from 'node:fs';\n\n    await pipeline(\n        stringifyChunked(data),\n        fs.createWriteStream('path/to/file.json')\n    );\n    ```\n- Wrapping into a `Promise` streaming into a stream (legacy Node.js):\n    ```js\n    import { Readable } from 'node:stream';\n\n    new Promise((resolve, reject) =\u003e {\n        Readable.from(stringifyChunked(data))\n            .on('error', reject)\n            .pipe(stream)\n            .on('error', reject)\n            .on('finish', resolve);\n    });\n    ```\n- Writing into a file synchronously:\n    \u003e Note: Slower than `JSON.stringify()` but uses much less heap space and has no limitation on string length\n    ```js\n    import fs from 'node:fs';\n\n    const fd = fs.openSync('output.json', 'w');\n\n    for (const chunk of stringifyChunked(data)) {\n        fs.writeFileSync(fd, chunk);\n    }\n\n    fs.closeSync(fd);\n    ```\n- Using with fetch (JSON streaming):\n    \u003e Note: This feature has limited support in browsers, see [Streaming requests with the fetch API](https://developer.chrome.com/docs/capabilities/web-apis/fetch-streaming-requests)\n\n    \u003e Note: `ReadableStream.from()` has limited [support in browsers](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream/from_static), use [`createStringifyWebStream()`](#createstringifywebstream) instead.\n    ```js\n    fetch('http://example.com', {\n        method: 'POST',\n        duplex: 'half',\n        body: ReadableStream.from(stringifyChunked(data))\n    });\n    ```\n- Wrapping into `ReadableStream`:\n    \u003e Note: Use `ReadableStream.from()` or [`createStringifyWebStream()`](#createstringifywebstream) when no extra logic is needed\n    ```js\n    new ReadableStream({\n        start() {\n            this.generator = stringifyChunked(data);\n        },\n        pull(controller) {\n            const { value, done } = this.generator.next();\n\n            if (done) {\n                
controller.close();\n            } else {\n                controller.enqueue(value);\n            }\n        },\n        cancel() {\n            this.generator = null;\n        }\n    });\n    ```\n\n### stringifyInfo()\n\n```ts\nexport function stringifyInfo(value: any, replacer?: Replacer, space?: Space): StringifyInfoResult;\nexport function stringifyInfo(value: any, options?: StringifyInfoOptions): StringifyInfoResult;\n\ntype StringifyInfoOptions = {\n    replacer?: Replacer;\n    space?: Space;\n    continueOnCircular?: boolean;\n}\ntype StringifyInfoResult = {\n    bytes: number;      // size of JSON in bytes\n    spaceBytes: number; // size of white spaces in bytes (when space option used)\n    circular: object[]; // list of circular references\n};\n```\n\nFunctions like [`JSON.stringify()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify), but returns an object with the expected overall size of the stringify operation and a list of circular references.\n\nExample:\n\n```js\nimport { stringifyInfo } from '@discoveryjs/json-ext';\n\nconsole.log(stringifyInfo({ test: true }, null, 4));\n// {\n//   bytes: 20,     // Buffer.byteLength('{\\n    \"test\": true\\n}')\n//   spaceBytes: 7,\n//   circular: []    \n// }\n```\n\n#### Options\n\n##### continueOnCircular\n\nType: `Boolean`  \nDefault: `false`\n\nDetermines whether to continue collecting info for a value when a circular reference is found. Setting this option to `true` allows finding all circular references.\n\n### parseFromWebStream()\n\nA helper function to consume JSON from a Web Stream. 
You can use `parseChunked(stream)` instead, but `@@asyncIterator` on `ReadableStream` has limited support in browsers (see [ReadableStream](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) compatibility table).\n\n```js\nimport { parseFromWebStream } from '@discoveryjs/json-ext';\n\nconst data = await parseFromWebStream(readableStream);\n// equivalent to (when ReadableStream[@@asyncIterator] is supported):\n// await parseChunked(readableStream);\n```\n\n### createStringifyWebStream()\n\nA helper function to convert `stringifyChunked()` into a `ReadableStream` (Web Stream). You can use `ReadableStream.from()` instead, but this method has limited support in browsers (see [ReadableStream.from()](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream/from_static) compatibility table).\n\n```js\nimport { createStringifyWebStream } from '@discoveryjs/json-ext';\n\ncreateStringifyWebStream({ test: true });\n// equivalent to (when ReadableStream.from() is supported):\n// ReadableStream.from(stringifyChunked({ test: true }))\n```\n\n## License\n\nMIT\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdiscoveryjs%2Fjson-ext","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdiscoveryjs%2Fjson-ext","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdiscoveryjs%2Fjson-ext/lists"}