https://github.com/thenriquedb/reading-large-files-with-node-streams
Read and write large CSV files efficiently using Node.js streams.
- Host: GitHub
- URL: https://github.com/thenriquedb/reading-large-files-with-node-streams
- Owner: thenriquedb
- Created: 2024-12-11T22:28:18.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-12-14T16:00:13.000Z (about 1 year ago)
- Last Synced: 2025-04-03T11:45:41.741Z (10 months ago)
- Topics: csv-parse, node, node-streams, nodejs, sqlite
- Language: JavaScript
- Homepage:
- Size: 9.77 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# CSV Read/Write with Node.js Streams and SQLite
This is a proof of concept (POC) demonstrating how to read and write large CSV files efficiently using Node.js streams. Rows read from the CSV are also inserted into an SQLite database as they stream in, making the approach suitable for handling large datasets without consuming excessive memory.
## Features
- **Efficiently read large CSV files** using Node.js streams.
- **Write data to new CSV files** with minimal memory usage.
- **Insert CSV data into an SQLite database** as it is read, line by line (see the sketch after this list).
- Suitable for processing large datasets without loading the entire file into memory.
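
The repository's own source isn't reproduced on this page, but the read-and-insert pattern described above might look roughly like the sketch below. It assumes a `users.csv` file with `name` and `email` columns, the `csv-parse` package (listed in the repo topics), and `better-sqlite3` as the database layer; the actual file names, schema, and SQLite driver used in the project may differ.

```js
// read-and-insert.js – stream a CSV file and insert each row into SQLite
// (illustrative sketch; file names, schema, and SQLite driver are assumptions)
import fs from "node:fs";
import { parse } from "csv-parse";
import Database from "better-sqlite3";

const db = new Database("users.db");
db.exec("CREATE TABLE IF NOT EXISTS users (name TEXT, email TEXT)");
const insert = db.prepare("INSERT INTO users (name, email) VALUES (?, ?)");

// csv-parse returns a Transform stream that is also async-iterable,
// so rows arrive one at a time instead of the whole file being buffered.
const parser = fs
  .createReadStream("users.csv")
  .pipe(parse({ columns: true })); // use the header row as object keys

for await (const row of parser) {
  insert.run(row.name, row.email); // insert each row as soon as it is parsed
}

db.close();
```

Because each row is handled as it streams in, memory usage stays roughly constant regardless of the size of the CSV file.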
## Install
1. Install dependencies
```bash
yarn
```
2. Generate the CSV file (a sketch of a possible seed script follows this step).
```bash
# If the number of records is not passed, the default value will be 50
yarn seed
```
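
The seed script itself isn't shown on this page, but a stream-based generator might look like the sketch below. The output file name and row format are assumptions, and the real script may use a faker library instead of synthetic strings.

```js
// seed.js – write N fake rows to users.csv without buffering them in memory
import fs from "node:fs";

const total = Number(process.argv[2] ?? 50); // default to 50 records
const out = fs.createWriteStream("users.csv");

out.write("name,email\n");
for (let i = 0; i < total; i++) {
  // write() returns false when the internal buffer is full,
  // so wait for "drain" to respect backpressure on large runs
  if (!out.write(`user${i},user${i}@example.com\n`)) {
    await new Promise((resolve) => out.once("drain", resolve));
  }
}
out.end();
```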
3. Run the application
```bash
yarn migrate
```