https://github.com/theproductiveprogrammer/analyze-large-json
- Host: GitHub
- URL: https://github.com/theproductiveprogrammer/analyze-large-json
- Owner: theproductiveprogrammer
- Created: 2023-07-30T23:26:47.000Z (over 2 years ago)
- Default Branch: master
- Last Pushed: 2023-07-30T23:30:44.000Z (over 2 years ago)
- Last Synced: 2025-08-09T01:27:16.946Z (3 months ago)
- Language: JavaScript
- Size: 10.7 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Analyse Large JSON Dumps
This is a small utility that helps analyse large JSON dumps that cannot be loaded into most text editors or with
simpler APIs (like `var myjson = require('large-file.json')`).
It also handles zipped files, since large JSON dumps are often compressed.
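For context, a minimal sketch of the naive in-memory approach this utility avoids (the file name here is illustrative); it fails long before multi-gigabyte sizes because the whole dump must fit into a single Node buffer and V8 string:
```js
const fs = require('fs');

/* Naive approach: buffer the entire file, then parse it.
 * For a multi-gigabyte dump this fails - readFileSync() cannot
 * return the whole file as one string, and JSON.parse() needs
 * the entire text in memory at once. */
const myjson = JSON.parse(fs.readFileSync('large-json-file.json', 'utf8'));
console.log(myjson.length);
```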
## Motivation
I wanted to analyse a large MySQL export in JSON format. The file was 35 GB, and it was quite difficult to get
any existing tool to give me useful results quickly, so I wrote this one.
## Usage
The library takes a large JSON file (plain text or zipped) and returns the records one by one through a callback.
The last call to the callback has no record, signalling that all records have been read. Return a non-zero value from
the callback if you want the utility to stop processing at any point.
```js
const alj = require('analyse-large-json');

let count = 0;
alj.loadRecords('large-json-file.zip', (err, rec) => {
  if(err) {
    console.error(err);
    return -1; /* non-zero return value stops processing */
  }
  if(!rec) {
    /* empty callback: all records have been read */
    console.log(`Done! Number of records: ${count}`);
  } else {
    console.log(rec);
    count++;
  }
});
```
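Since any non-zero return value stops processing, the same callback can also be used to sample just the first few records. A minimal sketch building on the example above (the cut-off of 10 is arbitrary):
```js
const alj = require('analyse-large-json');

let seen = 0;
alj.loadRecords('large-json-file.zip', (err, rec) => {
  if(err) return -1;       /* stop on error */
  if(!rec) return;         /* all records read */
  console.log(rec);
  seen++;
  if(seen >= 10) return 1; /* non-zero => stop after 10 records */
});
```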