Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
High-Performance Node Modules with Rust
https://github.com/bkonkle/rust-demo-node-modules
- Host: GitHub
- URL: https://github.com/bkonkle/rust-demo-node-modules
- Owner: bkonkle
- Created: 2024-09-12T21:51:33.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2024-10-17T20:30:13.000Z (3 months ago)
- Last Synced: 2024-11-29T11:10:51.626Z (about 1 month ago)
- Language: TypeScript
- Size: 384 KB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# High-Performance Node Modules with Rust
WIP
```sh
npm i
```

## Pre-commit
```sh
sudo apt install pre-commit
# or...
yay -S pre-commit
# or...
pipx install pre-commit
# or...
brew install pre-commit

# Then install the git hooks
pre-commit install
```

## Run on Local Machine
First, issue yourself a token:
```sh
npm run token
```

Then, run the local dev server:
```sh
USE_RUST_MODULE=false npm run dev
```

Now, make a `GET /me` call with your access token in the `Authorization` header, using a "Bearer" prefix. You should get back a 200 response, indicating that the JWT was validated.
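For context on what that validation covers, here's a self-contained sketch of the token's shape. The toy token and claim names below are illustrative only, not values issued by `npm run token`:

```typescript
// A JWT is three base64url-encoded segments: header.payload.signature.
// This toy token is illustrative; real tokens carry a genuine RSA signature
// that the server verifies before returning 200.
const header = Buffer.from(JSON.stringify({ alg: "RS256", typ: "JWT" })).toString("base64url");
const payload = Buffer.from(JSON.stringify({ sub: "user-1", exp: 9999999999 })).toString("base64url");
const token = `${header}.${payload}.fake-signature`;

// Decoding the claims (a real validation also checks the signature and `exp`):
const claims = JSON.parse(
  Buffer.from(token.split(".")[1], "base64url").toString("utf8"),
);
console.log(claims.sub); // "user-1"
```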
## Set up Torch for AI Querying
Install `libtorch` with the cxx11 ABI for use within the Rust module, and place it in `~/lib/libtorch`:
```sh
export VERSION=2.4.1
export CUDA_VERSION=cu124

wget https://download.pytorch.org/libtorch/$CUDA_VERSION/libtorch-cxx11-abi-shared-with-deps-$VERSION%2B$CUDA_VERSION.zip
unzip libtorch-cxx11-abi-shared-with-deps-$VERSION+$CUDA_VERSION.zip
rm libtorch-cxx11-abi-shared-with-deps-$VERSION+$CUDA_VERSION.zip
mv libtorch ~/lib
```

Then export some variables in your `.envrc` to point to it:
```sh
export LIBTORCH=~/lib/libtorch
export LD_LIBRARY_PATH=${LIBTORCH}/lib:$LD_LIBRARY_PATH
export LIBTORCH_BYPASS_VERSION_CHECK=1
```

The `LIBTORCH_BYPASS_VERSION_CHECK` variable allows patch version `2.4.1` to be used where `2.4.0` is expected.
## Compile the Rust modules
```sh
npm run build.jwt-rsa
npm run build.text-classification
```

This should generate a `lib/jwt_rsa/jwt-rsa...node` file, with an `index.js` and an `index.d.ts` alongside it, plus a similar set of files for the text-classification project. These are what you import to use the Rust modules in your Node.js application.
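The `USE_RUST_MODULE` flag used below implies a small toggle between a JS implementation and the compiled native module. A minimal sketch of that pattern, with illustrative function names (the project's actual exports may differ):

```typescript
// Sketch of an env-var toggle between a JS fallback and the compiled module.
// `verifyJs`/`verifyRust` are illustrative stand-ins, not the real exports.
type Verify = (token: string) => boolean;

const verifyJs: Verify = (token) => token.split(".").length === 3;
// In the real app this branch would come from the generated bindings,
// e.g. something like require("../lib/jwt_rsa"):
const verifyRust: Verify = (token) => token.split(".").length === 3;

const verify: Verify =
  process.env.USE_RUST_MODULE === "true" ? verifyRust : verifyJs;

console.log(verify("a.b.c")); // true
```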
Run the local dev server with the rust module enabled:
```sh
USE_RUST_MODULE=true npm run dev
```

## Fine-Tune the Model
Before you can query it, you need to take the base Bert uncased model and fine-tune it with the Snips dataset.
You'll need to create and activate a virtualenv so that dependencies like `torch` are available. With Poetry:
```sh
npm run poetry install
source $(npm run --silent poetry-path)/bin/activate
```

Then, run the fine-tuning script:
```sh
npm run train-text-classification -- --num-epochs 2
```

To convert the resulting model for usage with Rust, it's easiest to [use the `rust-bert` library directly](https://github.com/guillaume-be/rust-bert/tree/main?tab=readme-ov-file#loading-pretrained-and-custom-model-weights):
```sh
# Activate the virtual environment so that Torch is available
source $(npm run --silent poetry-path)/bin/activate

# Then switch to the path you've cloned the rust-bert library to
cd /path/to/rust-bert
python utils/convert_model.py path/to/data/snips-bert/pytorch_model.bin
```

You should see a `rust_model.ot` file in the `data/snips-bert` directory afterwards. This is the file the Rust process uses.
## Run Benchmarks
Issue a token:
```sh
npm run token
```

Assign the access token to the `AUTH_TOKEN` environment variable:
```sh
export AUTH_TOKEN="Bearer {access_token}"
```

Run the benchmarks:
```sh
npm run bench
```