Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/Realiserad/fish-ai
Supercharge your command line with LLMs and get shell scripting assistance in Fish. 💪
- Host: GitHub
- URL: https://github.com/Realiserad/fish-ai
- Owner: Realiserad
- License: mit
- Created: 2024-01-05T11:22:49.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2024-07-26T07:52:45.000Z (2 months ago)
- Last Synced: 2024-07-26T09:14:29.085Z (2 months ago)
- Topics: fish-plugin, fisher
- Language: Python
- Homepage:
- Size: 252 KB
- Stars: 49
- Watchers: 6
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
- Codeowners: .github/CODEOWNERS
README
![Badge with time spent](https://img.shields.io/endpoint?url=https%3A%2F%2Fgist.githubusercontent.com%2FRealiserad%2Fd3ec7fdeecc35aeeb315b4efba493326%2Fraw%2Ffish-ai-git-estimate.json)
# About
`fish-ai` adds AI functionality to [Fish](https://fishshell.com). It
has been tested on macOS and Linux, but should run on any system where
Python and Git are installed.

Originally based on [Tom Dörr's `fish.codex` repository](https://github.com/tom-doerr/codex.fish),
but with some additional functionality.

It can be hooked up to OpenAI, Azure OpenAI, Google, Hugging Face, Mistral or a
self-hosted LLM behind any OpenAI-compatible API.

If you like it, please add a ⭐. If you don't like it, create a PR. 😄
## 🎥 Demo
![demo](https://github.com/Realiserad/fish-ai/assets/6617918/49d8a959-8f6c-48d8-b788-93c560617c28)
## 👨‍🔧 How to install
### Create a configuration
Create a configuration file `~/.config/fish-ai.ini`.

If you use a self-hosted LLM:
```ini
[fish-ai]
configuration = self-hosted

[self-hosted]
provider = self-hosted
server = https://:/v1
model =
api_key =
```

If you are self-hosting, my recommendation is to use
[Ollama](https://github.com/ollama/ollama) with
[Llama 3.1 70B](https://ollama.com/library/llama3.1). An out of the box
configuration running on `localhost` could then look something
like this:

```ini
[fish-ai]
configuration = local-llama

[local-llama]
provider = self-hosted
server = http://localhost:11434/v1
```

If you use [OpenAI](https://platform.openai.com):
```ini
[fish-ai]
configuration = openai

[openai]
provider = openai
model = gpt-4o
api_key =
organization =
```

If you use [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service):
```ini
[fish-ai]
configuration = azure

[azure]
provider = azure
server = https://.openai.azure.com
model =
api_key =
```

If you use [Gemini](https://ai.google.dev):
```ini
[fish-ai]
configuration = gemini

[gemini]
provider = google
api_key =
```

If you use [Hugging Face](https://huggingface.co):
```ini
[fish-ai]
configuration = huggingface

[huggingface]
provider = huggingface
email =
password =
model = meta-llama/Meta-Llama-3.1-405B-Instruct-FP8
```

Available models are listed [here](https://huggingface.co/chat/models).
Note that 2FA must be disabled on the account.

If you use [Mistral](https://mistral.ai):
```ini
[fish-ai]
configuration = mistral

[mistral]
provider = mistral
api_key =
```

### Install `fish-ai`
Install the plugin using [`fisher`](https://github.com/jorgebucaran/fisher):
```shell
fisher install realiserad/fish-ai
```

## 🚀 How to use

### Transform comments into commands and vice versa
Type a comment (anything starting with `#`), and press **Ctrl + P** to turn it
into a shell command!

You can also run it in reverse. Type a command and press **Ctrl + P** to turn it
into a comment explaining what the command does.

### Autocomplete commands
Begin typing your command and press **Ctrl + Space** to display a list of
completions in [`fzf`](https://github.com/junegunn/fzf) (it is bundled
with the plugin, no need to install it separately). Completions load in the
background and show up as they become available.

### Suggest fixes
If a command fails, you can immediately press **Ctrl + Space** at the command prompt
to let `fish-ai` suggest a fix!

## 🤸 Additional options
You can tweak the behaviour of `fish-ai` by putting additional options in your
`fish-ai.ini` configuration file.

### Explain in a different language
To explain shell commands in a different language, set the `language` option
to the name of the language. For example:

```ini
[fish-ai]
language = Swedish
```

This will only work well if the LLM you are using has been trained on a dataset
with the chosen language.

### Change the temperature
Temperature is a decimal number between 0 and 1 controlling the randomness of
the output. Higher values make the LLM more creative, but may impact accuracy.
The default value is `0.2`.

Here is an example of how to increase the temperature to `0.5`:
```ini
[fish-ai]
temperature = 0.5
```

This option is not supported when using the `huggingface` provider.
### Number of completions
To change the number of completions suggested by the LLM when pressing
**Ctrl + Space**, set the `completions` option. The default value is `5`.

Here is an example of how you can increase the number of completions to `10`:
```ini
[fish-ai]
completions = 10
```

### Personalise completions using commandline history
You can personalise completions suggested by the LLM by sending
an excerpt of your commandline history.

To enable it, specify the maximum number of commands from the history
to send to the LLM using the `history_size` option. The default value
is `0` (do not send any commandline history).

```ini
[fish-ai]
history_size = 5
```

## 🔄 Switch between contexts

You can switch between different sections in the configuration using the
`fish_ai_switch_context` command.

## 🐾 Data privacy
When using the plugin, `fish-ai` submits the name of your OS and the
commandline buffer to the LLM.

When you codify a command, it also sends the contents of any files you
mention (as long as the file is readable), and when you explain or
autocomplete a command, the output from ` --help` is provided
to the LLM for reference.

`fish-ai` can also send an excerpt of your commandline history
when autocompleting a command. This is disabled by default.

Finally, to fix the previous command, the previous commandline buffer,
along with any terminal output and the corresponding exit code, is sent
to the LLM.

If you are concerned about data privacy, you should use a self-hosted
LLM. When hosted locally, no data ever leaves your machine.

### Redaction of sensitive information
The plugin attempts to redact sensitive information from the prompt
before submitting it to the LLM. Sensitive information is replaced by
the `` placeholder.

The following information is redacted:
- Passwords and API keys supplied on the commandline.
- Base64 encoded data in single or double quotes.
- PEM-encoded private keys.

## 🔨 Development
This repository ships with a `devcontainer.json` which can be used with
GitHub Codespaces or Visual Studio Code with
[the Dev Containers extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers).

To install `fish-ai` from a local copy, use `fisher`:
```shell
fisher install .
```

### Enable debug logging
Enable debug logging by putting `debug = True` in your `fish-ai.ini`.
Logging is done to syslog by default (if available). You can also enable
logging to file using `log = `, for example:

```ini
[fish-ai]
debug = True
log = ~/.fish-ai/log.txt
```

### Run the tests
[The installation tests](https://github.com/Realiserad/fish-ai/actions/workflows/installation-tests.yaml)
are packaged into containers and can be executed locally with e.g. `docker`.

```shell
docker build -f tests/ubuntu-22/Dockerfile .
docker build -f tests/fedora-40/Dockerfile .
docker build -f tests/archlinux-base/Dockerfile .
```

The Python modules containing most of the business logic can be tested using
`pytest`.

### Create a release
A release is created by GitHub Actions when a new tag is pushed.
```shell
set tag (grep '^version =' pyproject.toml | \
cut -d '=' -f2- | \
string replace -ra '[ "]' '')
git tag -a "v$tag" -m "🚀 v$tag"
git push origin "v$tag"
```
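The `grep`/`cut`/`string replace` pipeline above pulls the version number out of `pyproject.toml`. As a sanity check, the same extraction can be reproduced in plain POSIX shell (the version string below is made up for illustration):

```shell
# Extract the version from a pyproject.toml-style line, mirroring the
# fish pipeline above: take everything after '=', then strip spaces and quotes.
printf 'version = "1.2.3"\n' | cut -d '=' -f2- | tr -d ' "'
# prints: 1.2.3
```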