{"id":15547321,"url":"https://github.com/uhop/stream-csv-as-json","last_synced_at":"2025-06-11T02:06:17.726Z","repository":{"id":57371983,"uuid":"137971544","full_name":"uhop/stream-csv-as-json","owner":"uhop","description":"Micro-library of Node stream components with minimal dependencies for creating custom data processors oriented on processing huge CSV files while requiring a minimal memory footprint.","archived":false,"fork":false,"pushed_at":"2024-11-05T06:34:19.000Z","size":957,"stargazers_count":19,"open_issues_count":1,"forks_count":2,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-06-06T19:15:09.462Z","etag":null,"topics":["csv","csv-converter","csv-export","csv-import","csv-parser","csv-reader","stream-csv","stream-json"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/uhop.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null},"funding":{"github":"uhop","buy_me_a_coffee":"uhop"}},"created_at":"2018-06-20T02:47:21.000Z","updated_at":"2024-11-06T20:32:53.000Z","dependencies_parsed_at":"2025-04-14T06:13:11.926Z","dependency_job_id":"19502ade-f639-4156-8fb9-ec36749d49d9","html_url":"https://github.com/uhop/stream-csv-as-json","commit_stats":{"total_commits":39,"total_committers":1,"mean_commits":39.0,"dds":0.0,"last_synced_commit":"e627b17716f363fef56f58c1289eb0e9f58dc674"},"previous_names":[],"tags_count":6,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/uh
op%2Fstream-csv-as-json","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/uhop%2Fstream-csv-as-json/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/uhop%2Fstream-csv-as-json/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/uhop%2Fstream-csv-as-json/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/uhop","download_url":"https://codeload.github.com/uhop/stream-csv-as-json/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/uhop%2Fstream-csv-as-json/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":259184740,"owners_count":22818267,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["csv","csv-converter","csv-export","csv-import","csv-parser","csv-reader","stream-csv","stream-json"],"created_at":"2024-10-02T13:08:39.381Z","updated_at":"2025-06-11T02:06:17.688Z","avatar_url":"https://github.com/uhop.png","language":"JavaScript","readme":"# stream-csv-as-json [![NPM version][npm-img]][npm-url]\n\n[npm-img]: https://img.shields.io/npm/v/stream-csv-as-json.svg\n[npm-url]: https://npmjs.org/package/stream-csv-as-json\n\n`stream-csv-as-json` is a micro-library of node.js stream components with minimal dependencies for creating custom data processors oriented on processing huge CSV files while requiring a minimal memory footprint. It can parse CSV files far exceeding available memory. Even individual primitive data items can be streamed piece-wise. 
A streaming SAX-inspired, event-based API is included as well.\n\n`stream-csv-as-json` is a companion project for [stream-json](https://www.npmjs.com/package/stream-json) and is meant to be used with its filters, streamers and general infrastructure.\n\nAvailable components:\n\n* Streaming JSON [Parser](https://github.com/uhop/stream-csv-as-json/wiki/Parser).\n  * It produces a SAX-like token stream.\n  * Optionally it can pack individual values.\n  * The [main module](https://github.com/uhop/stream-csv-as-json/wiki/Main-module) provides helpers to create a parser.\n* Essentials:\n  * [AsObjects](https://github.com/uhop/stream-csv-as-json/wiki/AsObjects) uses the first row as a list of field names and produces rows as shallow objects with named fields.\n  * [Stringer](https://github.com/uhop/stream-csv-as-json/wiki/Stringer) converts a token stream back into a JSON text stream.\n\nAll components are meant to be building blocks for creating flexible custom data processing pipelines. They can be extended and/or combined with custom code, and they can be used together with [stream-chain](https://www.npmjs.com/package/stream-chain) and [stream-json](https://www.npmjs.com/package/stream-json) to simplify data processing.\n\nThis toolkit is distributed under the New BSD license.\n\n## Introduction\n\n```js\nconst {chain}  = require('stream-chain');\n\nconst {parser} = require('stream-csv-as-json');\nconst {asObjects} = require('stream-csv-as-json/AsObjects');\nconst {streamValues} = require('stream-json/streamers/StreamValues');\n\nconst fs   = require('fs');\nconst zlib = require('zlib');\n\nconst pipeline = chain([\n  fs.createReadStream('sample.csv.gz'),\n  zlib.createGunzip(),\n  parser(),\n  asObjects(),\n  streamValues(),\n  data =\u003e {\n    const value = data.value;\n    return value \u0026\u0026 value.department === 'accounting' ? 
data : null;\n  }\n]);\n\nlet counter = 0;\npipeline.on('data', () =\u003e ++counter);\npipeline.on('end', () =\u003e\n  console.log(`The accounting department has ${counter} employees.`));\n```\n\nSee the full documentation in the [Wiki](https://github.com/uhop/stream-csv-as-json/wiki).\n\n## Installation\n\n```bash\nnpm install --save stream-csv-as-json\n# or:\nyarn add stream-csv-as-json\n```\n\n## Use\n\nThe whole library is organized as a set of small components that can be combined to produce the most effective pipeline. All components are based on Node.js [streams](http://nodejs.org/api/stream.html) and [events](http://nodejs.org/api/events.html), and they implement all required standard APIs. It is easy to add your own components to solve your unique tasks.\n\nThe code of all components is compact and simple. Please take a look at their source code to see how things are implemented, so you can produce your own components in no time.\n\nIf you find a bug, see a way to simplify existing components, or create new generic components that can be reused in a variety of projects, don't hesitate to open a ticket and/or create a pull request.\n\n## License\n\nBSD-3-Clause\n\n## Release History\n\n- 1.0.5 *technical release: updated deps.*\n- 1.0.4 *technical release: updated deps.*\n- 1.0.3 *technical release: updated deps.*\n- 1.0.2 *technical release: updated deps, updated license's year.*\n- 1.0.1 *minor readme tweaks, added TypeScript typings and the badge.*\n- 1.0.0 *the first 1.0 release.*\n","funding_links":["https://github.com/sponsors/uhop","https://buymeacoffee.com/uhop"],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fuhop%2Fstream-csv-as-json","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fuhop%2Fstream-csv-as-json","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fuhop%2Fstream-csv-as-json/lists"}