LM Studio CLI
https://github.com/lmstudio-ai/lms
- Host: GitHub
- URL: https://github.com/lmstudio-ai/lms
- Owner: lmstudio-ai
- License: MIT
- Created: 2024-04-15T15:04:41.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2026-01-12T21:44:06.000Z (about 2 months ago)
- Last Synced: 2026-01-12T21:57:10.588Z (about 2 months ago)
- Topics: llm, lmstudio, nodejs, typescript
- Language: TypeScript
- Homepage: https://lms.dev
- Size: 713 KB
- Stars: 4,047
- Watchers: 35
- Forks: 315
- Open Issues: 205
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
Awesome Lists containing this project
- StarryDivineSky - lmstudio-ai/lms
- awesome-repositories - lmstudio-ai/lms - LM Studio CLI (TypeScript)
- awesome-production-machine-learning - LM Studio - LM Studio is a tool for deploying LLM models locally on the computer, even on a relatively modest machine, provided it meets the minimum requirements. (Deployment and Serving)
README
lms - Command Line Tool for LM Studio
Built with lmstudio.js
# Installation
`lms` ships with [LM Studio](https://lmstudio.ai/) 0.2.22 and newer.
If you have trouble running the command, try running `npx lmstudio install-cli` to add it to your PATH.
To check if the bootstrapping was successful, run the following in a **👉 new terminal window 👈**:
```shell
lms
```
# Usage
You can use `lms --help` to see a list of all available subcommands.
For details about each subcommand, run `lms <subcommand> --help`.
Here are some frequently used commands:
- `lms status` - To check the status of LM Studio.
- `lms server start` - To start the local API server.
- `lms server stop` - To stop the local API server.
- `lms ls` - To list all downloaded models.
- `lms ls --json` - To list all downloaded models in machine-readable JSON format.
- `lms ps` - To list all loaded models available for inferencing.
- `lms ps --json` - To list all loaded models available for inferencing in machine-readable JSON format.
- `lms load` - To load a model.
- `lms load -y` - To load a model with maximum GPU acceleration without confirmation.
- `lms unload` - To unload a model.
- `lms unload --all` - To unload all models.
- `lms create` - To create a new project with the LM Studio SDK.
- `lms log stream` - To stream logs from LM Studio.
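The `--json` variants are meant for scripting. As a minimal sketch of consuming that output (the field names in the sample below are illustrative assumptions, not the CLI's documented schema), you could extract model identifiers like this:

```shell
# Hypothetical sample of `lms ls --json` output; the real field names may differ.
sample='[{"path":"llama-3-8b","sizeBytes":4800000000},{"path":"phi-3-mini","sizeBytes":2300000000}]'

# Print one model path per line. python3 is used for JSON parsing here,
# since it is more commonly preinstalled than jq.
echo "$sample" | python3 -c 'import json,sys; [print(m["path"]) for m in json.load(sys.stdin)]'
```

In a real script you would pipe `lms ls --json` directly into the parser instead of the canned sample.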
# Contributing
The CLI is part of the [lmstudio.js monorepo](https://github.com/lmstudio-ai/lmstudio.js) and cannot be built standalone.
## Building and Testing the CLI
```bash
# Clone and build the entire monorepo
git clone https://github.com/lmstudio-ai/lmstudio.js.git --recursive
cd lmstudio.js
npm install
npm run build

# Test your CLI changes
node publish/cli/dist/index.js
```
**Example:**
```bash
node publish/cli/dist/index.js --help
node publish/cli/dist/index.js status
```
See [CONTRIBUTING.md](CONTRIBUTING.md) for more information.