LLM OS
https://github.com/automata/llmos
- Host: GitHub
- URL: https://github.com/automata/llmos
- Owner: automata
- License: MIT
- Created: 2023-11-05T16:51:54.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2023-11-11T13:57:51.000Z (about 2 years ago)
- Last Synced: 2025-06-02T18:35:10.287Z (7 months ago)
- Language: Python
- Size: 7.81 KB
- Stars: 2
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# LLM OS
The idea is to explore LLMs as the kernel process of an OS,
inspired by [Karpathy](https://x.com/karpathy/status/1707437820045062561?s=20),
as part of [Backdrop Build](https://backdropbuild.com/).
```
User, Other LLMs, LLMOS itself, LLM Apps: Samantha (tm) ;-)

                 -------------
                 | User Mode |
                 -------------
                       |
                 -------------
                 |  LLM OS   |
                 -------------
                       |
         .---------+--------+--------.
         |         |        |        |
      -------   -------   ------   ------
      | CPU |   | MEM |   | IO |   | FS |
      -------   -------   ------   ------
   (Scheduler,  (Embedding  (Browser,  (Embedding,
    Python       Context,    Web APIs,  Files, ...)
    exec)        Vector db   Other
                 (long term), LLMs, ...)
                 MemGPT?)
```
LLM vs OS (a rough kernel dispatch sketch follows this list):
- I/O: Multimodal input and output after vectorizing/tokenizing
- IPC / Networking: Plugins to handle internet APIs and browser access
- CPU / Runtime exec / Scheduler: Eval of code in any programming language (sandboxed), at any time. Smarter scheduler.
- File System: Plugins to handle the local FS
- Physical/Virtual Memory: Vector-based memory (MemGPT)
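To make that mapping concrete, here is a minimal Python sketch of an LLM-as-kernel dispatch loop. It is not the repo's actual code: `call_llm`, the module classes, and the routing prompt are placeholder assumptions.
```
# Minimal sketch (not the project's real implementation): the LLM acts as
# the kernel and routes user requests to CPU / MEM / ... modules.

def call_llm(prompt: str) -> str:
    """Placeholder: plug in whatever chat-completion client you use."""
    raise NotImplementedError

class CPU:
    """Scheduler / code eval (sandboxing omitted in this sketch)."""
    def handle(self, payload: str) -> str:
        # WARNING: bare eval is NOT sandboxed; a real kernel must isolate this.
        return str(eval(payload))

class MEM:
    """Embedding/MemGPT-style memory, reduced here to a plain dict."""
    def __init__(self) -> None:
        self.store: dict[str, str] = {}
    def handle(self, payload: str) -> str:
        key, _, value = payload.partition("=")
        self.store[key.strip()] = value.strip()
        return f"stored {key.strip()}"

MODULES = {"CPU": CPU(), "MEM": MEM()}  # IO and FS modules omitted for brevity

def kernel(user_input: str) -> str:
    # The LLM plays the kernel: it decides which module serves the request
    # and with what payload, answering in the form "MODULE payload".
    decision = call_llm(
        f"Route this request to one of {list(MODULES)} as 'MODULE payload':\n{user_input}"
    )
    module_name, _, payload = decision.partition(" ")
    return MODULES[module_name].handle(payload)
```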
New possibilities to explore:
- Non-determinism / flexibility
- AutoGPT-ish mechanism to extend its own implementation
What could v0 be?
- Put all those pieces together and solve a toy problem
- Build a simple "LLM App" on top of it, assistant-like, Sam (short for Samantha) :-) (see the sketch below)
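Purely as an illustration of what that app layer might look like (building on the hypothetical `kernel` and `call_llm` from the sketch above, not on the project's actual API):
```
# Sketch only: a toy "Sam" assistant running as a user-mode LLM app on top
# of the kernel() routine sketched earlier. The persona prompt and the
# interfaces are assumptions, not the project's real API.

SAM_PERSONA = "You are Sam, a friendly assistant running as an LLM OS app."

def sam(user_input: str) -> str:
    # An LLM app is just user-mode code that asks the LLM OS kernel to do
    # the work and then phrases the result for the user.
    result = kernel(user_input)
    return call_llm(f"{SAM_PERSONA}\nThe OS returned: {result}\nReply to the user.")
```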
# Using
```
(venv) % python llmos.py
< LLMOS Welcome! Waiting USER input...
> Calculate 2 + 3
< CPU eval('2 + 3')
< LLMOS Output of 2 + 3 is 5. Waiting for further USER input...
> Store result for later
< MEM STORE calculation_result=5
>
```
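The transcript suggests the LLM replies with kernel commands such as `CPU eval('2 + 3')` and `MEM STORE calculation_result=5`. A minimal loop that interprets replies of that shape might look like the following; the command grammar and `call_llm` are inferred/assumed, not taken from `llmos.py`.
```
# Illustrative REPL matching the session above. The "CPU eval(...)" and
# "MEM STORE key=value" grammar is inferred from the transcript, not from
# the repo's source; call_llm is the placeholder from the earlier sketch.
import re

memory: dict[str, str] = {}

def execute(command: str) -> str:
    if m := re.match(r"CPU eval\('(.*)'\)", command):
        return f"Output of {m.group(1)} is {eval(m.group(1))}"  # no sandboxing here
    if m := re.match(r"MEM STORE (\w+)=(\S+)", command):
        memory[m.group(1)] = m.group(2)
        return f"Stored {m.group(1)}"
    return "Sorry, I did not understand that command"

def repl():
    print("< LLMOS Welcome! Waiting USER input...")
    while True:
        user = input("> ")
        command = call_llm(f"Translate this request into a kernel command: {user}")
        print(f"< {command}")
        print(f"< LLMOS {execute(command)}. Waiting for further USER input...")
```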
# Refs
- https://github.com/cpacker/MemGPT
- https://github.com/stanfordnlp/dspy
- https://github.com/langchain-ai/langchain