Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/Sparky4567/obsidian_ai_plugin
Lets you use local LLMs in your Obsidian Vaults, create new texts from your prompts, and generate texts based on your inputs
ai llm local obsidian ollama plugin
Last synced: 10 days ago
JSON representation
Lets you use local LLMs in your Obsidian Vaults, create new texts from your prompts, and generate texts based on your inputs
- Host: GitHub
- URL: https://github.com/Sparky4567/obsidian_ai_plugin
- Owner: Sparky4567
- License: apache-2.0
- Created: 2024-03-18T13:22:21.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2024-06-30T08:23:49.000Z (6 months ago)
- Last Synced: 2024-07-01T15:19:18.605Z (5 months ago)
- Topics: ai, llm, local, obsidian, ollama, plugin
- Language: TypeScript
- Homepage: https://github.com/Sparky4567/obsidian_ai_plugin
- Size: 118 KB
- Stars: 20
- Watchers: 1
- Forks: 2
- Open Issues: 2
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
Awesome Lists containing this project
- awesome-obsidian-ai-tools - https://github.com/Sparky4567/obsidian_ai_plugin
README
## Obsidian AI plugin
This plugin was developed to provide a better way to use local LLM models with Obsidian.
## How to use the plugin
- Download/clone the plugin into your plugins folder, then install its dependencies and build it
```
cd .obsidian/plugins
git clone https://github.com/Sparky4567/obsidian_ai_plugin.git
cd obsidian_ai_plugin
npm install
npm run build
```
  Then open the Obsidian app, enable community plugin support, enable the LLM plugin, and choose a model within the settings tab.
- Ensure that you have Ollama installed
```
https://ollama.com/download
```
Read the Ollama documentation as needed.
- Ensure that an Ollama model is running in the background, for example
```
ollama run tinyllama
```
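Once a model is running, Ollama serves a local REST API (port 11434 by default) that tools like this plugin can call. A minimal TypeScript sketch, assuming the standard `/api/generate` endpoint with a non-streaming response; `buildGeneratePayload` and `askOllama` are illustrative helpers, not part of the plugin's actual code:

```typescript
// Shape of a request to Ollama's /api/generate endpoint.
interface GeneratePayload {
  model: string;
  prompt: string;
  stream: boolean;
}

// Hypothetical helper: assemble the request body for a single, non-streamed reply.
function buildGeneratePayload(model: string, prompt: string): GeneratePayload {
  // stream: false asks Ollama for one JSON object instead of chunked lines
  return { model, prompt, stream: false };
}

// Hypothetical helper: POST the prompt to the local Ollama server and
// return the generated text from the `response` field.
async function askOllama(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGeneratePayload(model, prompt)),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

If nothing is listening on port 11434 (i.e. no Ollama model is running), a call like this fails, which is why this step matters.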
- If you downloaded this plugin from the GitHub repo, copy it to your .obsidian/plugins folder and don't forget to run
```
npm install
```
  within the plugin's directory to install all needed dependencies.
- Ensure that the plugin is activated
- Choose the right endpoint and model in plugins settings
- Write something into editor field (Simple text)
- Select the text with your mouse
- Press CTRL+P after selection
- Type in ASK LLM and choose the command you want (there aren't many at the moment)
- Press Enter to confirm
- Wait a while for the result. Your text will be replaced with the text returned by the LLM (the default model is tinyllama)
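The select-and-replace flow above can be sketched in TypeScript. The minimal `Editor` interface below mirrors the real `obsidian` module's `getSelection`/`replaceSelection` methods just enough to show the logic, and `askLlm` is a stub standing in for whatever request the plugin actually sends to the configured endpoint:

```typescript
// Minimal stand-in for Obsidian's Editor API (assumption, for illustration).
interface Editor {
  getSelection(): string;
  replaceSelection(text: string): void;
}

// Hypothetical LLM call; the real plugin talks to the configured Ollama endpoint.
async function askLlm(prompt: string): Promise<string> {
  return `LLM answer for: ${prompt}`; // stub reply for illustration
}

// What an "Ask LLM" editor command roughly does with your selection:
async function runAskLlmCommand(editor: Editor): Promise<void> {
  const selected = editor.getSelection(); // the text you highlighted
  if (!selected) return;                  // nothing selected, nothing to do
  const answer = await askLlm(selected);  // wait for the model's reply
  editor.replaceSelection(answer);        // selection is swapped for the result
}
```

This is why the selection step matters: whatever you highlight is the prompt, and the reply overwrites it in place.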
If you have any questions related to the plugin or want to extend the functionality, write an email to [email protected] and I will try to respond as soon as I can.
### Recommendations
- A laptop with at least 8GB of RAM and a decent processor (for local usage)
### Want to support the project?
[![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/K3K06VU8Z)
[![wakatime](https://wakatime.com/badge/user/1fbc8005-b2d0-4f4f-93e8-f12d7d25d676/project/018e50a2-95fc-40fa-aed2-18be07c19419.svg)](https://wakatime.com/badge/user/1fbc8005-b2d0-4f4f-93e8-f12d7d25d676/project/018e50a2-95fc-40fa-aed2-18be07c19419)