# AI Assistant (AIA)

`aia` is a command-line utility that facilitates interaction with AI models. It automates the management of pre-compositional prompts and executes generative AI (Gen-AI) commands on those prompts, taking advantage of the increased context window sizes of modern LLMs.

It leverages the `prompt_manager` gem to manage prompts for the `mods` and `sgpt` CLI utilities. It utilizes `ripgrep` for searching for prompt files and `fzf` for prompt selection based on a search term and fuzzy matching.

**Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)

> Just an FYI ... I am working in the `develop` branch to **drop the dependency on backend LLM processors like mods and llm.** I'm refactoring aia to use my own universal client gem called ai_client, which gives access to all models and all providers.

<!-- Tocer[start]: Auto-generated, don't remove. -->
## Table of Contents

  - [Installation](#installation)
  - [Usage](#usage)
  - [Configuration Using Envars](#configuration-using-envars)
  - [Shell Integration inside of a Prompt](#shell-integration-inside-of-a-prompt)
      - [Access to System Environment Variables](#access-to-system-environment-variables)
      - [Dynamic Shell Commands](#dynamic-shell-commands)
      - [Chat Session Use](#chat-session-use)
  - [*E*mbedded *R*u*B*y (ERB)](#embedded-ruby-erb)
    - [Chat Session Behavior](#chat-session-behavior)
  - [Prompt Directives](#prompt-directives)
    - [Parameter and Shell Substitution in Directives](#parameter-and-shell-substitution-in-directives)
    - [`aia` Specific Directive Commands](#aia-specific-directive-commands)
      - [//config](#config)
      - [//include](#include)
      - [//ruby](#ruby)
      - [//shell](#shell)
    - [Backend Directive Commands](#backend-directive-commands)
    - [Using Directives in Chat Sessions](#using-directives-in-chat-sessions)
  - [Prompt Sequences](#prompt-sequences)
    - [--next](#--next)
    - [--pipeline](#--pipeline)
    - [Best Practices ??](#best-practices-)
    - [Example Pipeline](#example-pipeline)
  - [All About ROLES](#all-about-roles)
    - [The --roles_dir (AIA_ROLES_DIR)](#the---roles_dir-aia_roles_dir)
    - [The --role Option](#the---role-option)
    - [Other Ways to Insert Roles into Prompts](#other-ways-to-insert-roles-into-prompts)
  - [External CLI Tools Used](#external-cli-tools-used)
    - [Optional External CLI-tools](#optional-external-cli-tools)
      - [Backend Processor `llm`](#backend-processor-llm)
      - [Backend Processor `sgpt`](#backend-processor-sgpt)
      - [Occasionally Useful Tool `plz`](#occasionally-useful-tool-plz)
  - [Shell Completion](#shell-completion)
  - [My Most Powerful Prompt](#my-most-powerful-prompt)
  - [My Configuration](#my-configuration)
  - [Executable Prompts](#executable-prompts)
  - [Development](#development)
  - [Contributing](#contributing)
  - [License](#license)

<!-- Tocer[finish]: Auto-generated, don't remove. -->


## Installation

Install the gem by executing:

    gem install aia

Install the command-line utilities by executing:

    brew install mods fzf ripgrep

You will also need to establish a directory in your file system where your prompt text files, last-used parameters, and usage log files are kept.

Set up a system environment variable (envar) named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is a directory in your HOME directory named ".prompts". The envar "AIA_ROLES_DIR" points to your roles directory, where you keep prompts that define the different roles you want the LLM to assume when it is doing its work. The default roles directory is inside the prompts directory; its name is "roles".

You may also want to install the completion script for your shell. To get a copy of the completion script do:

`aia --completion bash`

`fish` and `zsh` are also available.


## Usage

The usage report obtained using either `-h` or `--help` is implemented as a standard `man` page. You can use `--help --verbose` or `-h -v` together to get not only the `aia` man page but also the usage report from the `backend` LLM processing tool.

```shell
$ aia --help
```

## Configuration Using Envars

The `aia` configuration defaults can be overridden by system environment variables *(envars)* with the prefix "AIA_" followed by the config item name, also in uppercase. All configuration items can be overridden in this way by an envar.
The following table shows a few examples.

| Config Item   | Default Value | envar key |
| ------------- | ------------- | --------- |
| backend       | mods          | AIA_BACKEND |
| config_file   | nil           | AIA_CONFIG_FILE |
| debug         | false         | AIA_DEBUG |
| edit          | false         | AIA_EDIT |
| extra         | ''            | AIA_EXTRA |
| fuzzy         | false         | AIA_FUZZY |
| log_file      | ~/.prompts/_prompts.log | AIA_LOG_FILE |
| markdown      | true          | AIA_MARKDOWN |
| model         | gpt-4-1106-preview | AIA_MODEL |
| out_file      | STDOUT        | AIA_OUT_FILE |
| prompts_dir   | ~/.prompts    | AIA_PROMPTS_DIR |
| speech_model  | tts-1         | AIA_SPEECH_MODEL |
| verbose       | false         | AIA_VERBOSE |
| voice         | alloy         | AIA_VOICE |

See the `@options` hash in the `cli.rb` file for a complete list. There are some config items that do not necessarily make sense for use as an envar override. For example, if you set `export AIA_DUMP_FILE=config.yaml` then `aia` would dump the current configuration to config.yaml and exit every time it is run, until you finally `unset AIA_DUMP_FILE`.

In addition to these config items for `aia`, the optional command line parameters for the backend prompt processing utilities (mods and sgpt) can also be set using envars with the "AIA_" prefix. For example, `export AIA_TOPP=1.0` will set the `--topp 1.0` command line option for the `mods` utility when it's used as the backend processor.

## Shell Integration inside of a Prompt

Using the option `--shell` enables `aia` to access your terminal's shell environment from inside the prompt text.

#### Access to System Environment Variables

`aia` can replace any system environment variable (envar) references in the prompt text with the value of the envar. Patterns like $USER and ${USER} in the prompt will be replaced with that envar's value - the name of the user's account. Any envar can be used.
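The substitution described above can be pictured with a small Ruby sketch. This is illustrative only - `expand_envars` is a made-up name, not part of `aia`:

```ruby
# Illustrative sketch of envar substitution in prompt text.
# NOT the actual aia implementation; expand_envars is a made-up name.
def expand_envars(text, env = ENV)
  # Replace ${NAME} first, then bare $NAME references.
  text.gsub(/\$\{(\w+)\}|\$(\w+)/) do
    name = Regexp.last_match(1) || Regexp.last_match(2)
    env.fetch(name, '')   # unknown envars become empty strings
  end
end
```

For example, `expand_envars("Hello $USER")` would yield something like "Hello ted" for a user whose account name is ted.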
#### Dynamic Shell Commands

Dynamic content can be inserted into the prompt using the pattern $(shell command), where the output of the shell command replaces the $(...) pattern.

Consider the power of tailoring a prompt to your specific operating system:

```
As a system administrator on a $(uname -v) platform what is the best way to [DO_SOMETHING]
```

or insert content from a file in your home directory:

```
Given the following constraints $(cat ~/3_laws_of_robotics.txt) determine the best way to instruct my roomba to clean my kids room.
```

#### Chat Session Use

When you use the `--shell` option to start a chat session, shell integration is available in your follow-up prompts. Suppose you started a chat session using a role of "Ruby Expert" expecting to chat about changes that could be made to a specific class, BUT you forgot to include the class source file as part of the context when you got started. You could enter this as your follow-up prompt to keep going:

```
The class I want to chat about refactoring is this one: $(cat my_class.rb)
```

That inserts the entire class source file into your follow-up prompt. You can continue chatting with your AI Assistant about changes to the class.

## *E*mbedded *R*u*B*y (ERB)

The inclusion of dynamic content through the shell integration provided by the `--shell` option is significant. `aia` also provides the full power of embedded Ruby code processing within the prompt text.

The `--erb` option turns the prompt text file into a fully functioning ERB template. The [Embedded Ruby (ERB) template syntax (2024)](https://bophin-com.ngontinh24.com/article/language-embedded-ruby-erb-template-syntax) provides a good overview of the syntax and power of ERB.

Most websites that have information about ERB give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. `aia`, on the other hand, uses ERB to generate dynamic prompt text.
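As a sketch of what the `--erb` option makes possible, here is a made-up prompt template rendered with Ruby's standard ERB library (an illustration, not `aia` internals):

```ruby
require 'erb'

# A prompt template that computes part of its own text.
template = <<~PROMPT
  Today is <%= Time.now.strftime('%A') %>.
  Review the following <%= files.size %> files:
  <% files.each do |f| -%>
  - <%= f %>
  <% end -%>
PROMPT

files = %w[app.rb spec.rb]
# trim_mode "-" honors the -%> line-trimming tags used above
puts ERB.new(template, trim_mode: '-').result(binding)
```

The rendered text - not the template - is what would be sent to the backend as the prompt.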
### Chat Session Behavior

A chat session, whether started by the `--chat` option or its equivalent directive within a prompt text file, behaves a little differently with respect to its binding and local variable assignments. Since a chat session by definition has multiple prompts, setting a local variable in one prompt and expecting it to be available in a subsequent prompt does not work. You need to use instance variables to accomplish this prompt-to-prompt carry-over of information.

Also, since follow-up prompts are expected to be a single thing - a sentence or paragraph - terminated by a single return, it is unlikely that ERB will be of much benefit; but you may find a use for it.

## Prompt Directives

Downstream processing directives were added to the `prompt_manager` gem used by `aia` at version 0.4.1. These directives are lines in the prompt text file that begin with "//" having this pattern:

```
//command parameters
```

There is no space between the "//" and the command.

### Parameter and Shell Substitution in Directives

When you combine prompt directives with prompt parameters and shell envar substitutions, you can get some powerful compositional prompts.

Here is an example of a pure generic directive.

```
//[DIRECTIVE_NAME] [DIRECTIVE_PARAMS]
```

When the prompt runs, you will be asked to provide a value for each of the parameters. You could answer "shell" for the directive name and "calc 22/7" if you wanted a bad approximation of PI.

Try this prompt file:
```
//shell calc [FORMULA]

What does that number mean to you?
```

### `aia` Specific Directive Commands

At this time `aia` only has a few directives, which are detailed below.

#### //config

The `//config` directive within a prompt text file is used to tailor the specific configuration environment for the prompt.
All configuration items are available to have their values changed. The order of value assignment for a configuration item starts with the default value, which is replaced by the envar value, which is replaced by the command line option value, which is replaced by the value from the config file.

The `//config` directive is the last and final way of changing the value of a configuration item for a specific prompt.

The switch options are treated like booleans. They are either `true` or `false`. Their name within the context of a `//config` directive always ends with a "?" character - question mark.

To set the value of a switch using `//config`, for example `--terse` or `--chat`, do this:

```
//config chat? = true
//config terse? = true
```

A configuration item such as `--out_file` or `--model` has an associated value on the command line. To set that value with the `//config` directive, do it like this:

```
//config model = gpt-3.5-turbo
//config out_file = temp.md
//config backend = mods
```

BTW: the "=" is completely optional. It's actually ignored, as is ":=" if you were to choose that as your assignment operator. Also, the number of spaces between the item and the value is completely arbitrary. I like to line things up, so this syntax is just as valid:

```
//config model       gpt-3.5-turbo
//config out_file    temp.md
//config backend     mods
//config chat?       true
//config terse?      true
//config model       gpt-4
```

NOTE: if you specify the same config item name more than once within the prompt file, it's the last one that will be set when the prompt is finally processed through the LLM. For example, in the example above `gpt-4` will be the model used. Being first does not count in this case.
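That flexible assignment syntax could be handled by a parser along these lines. This is an illustrative sketch - `parse_config_directives` is a made-up name, not the actual `aia` code:

```ruby
# Illustrative parser for //config lines; NOT the actual aia code.
# Accepts "//config item = value", "//config item := value",
# or just "//config item value" -- and later lines win.
def parse_config_directives(prompt_text)
  prompt_text.each_line.with_object({}) do |line, config|
    next unless line.strip =~ %r{\A//config\s+(\S+)\s*(?::?=)?\s*(.*)\z}
    item, value = $1, $2
    config[item] = value   # a repeated item keeps its last value
  end
end
```

Note how the optional `=` / `:=` and the arbitrary spacing fall out of a single regular expression, and how overwriting the hash entry implements the "last one wins" rule.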
#### //include

Example:
```
//include path_to_file
```

The `path_to_file` can be either absolute or relative. If it is relative, it is anchored at the PWD. If the `path_to_file` includes envars, the `--shell` CLI option must be used so that the envar in the directive is replaced with its actual value.

The file that is included will have any comments or directives excluded. It is expected that the file will be a text file so that its content can be pre-pended to the existing prompt; however, if the file is a source code file (ex: file.rb) the source code will be included, BUT any comment line or line that starts with "//" will be excluded.

TODO: Consider adding a command line option `--include_dir` to specify the place from which relative files are to come.

#### //ruby

Example:
```
//ruby any_code_that_returns_an_instance_of_String
```

This directive is in addition to ERB. At this point the `//ruby` directive is limited by the current binding, which is within the `AIA::Directives#ruby` method. As such it is not likely to see much use.

However, since it is implemented as a simple `eval(code)`, there is a potential for use like this:
```
//ruby load(some_ruby_file); execute_some_method
```

Each execution of a `//ruby` directive will be a fresh execution of the `AIA::Directives#ruby` method, so you cannot carry local variables from one invocation to another; however, you could do something with instance variables or global variables. You might even add something to the `AIA.config` object to be passed on to the next invocation of the directive within the context of the same prompt.
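The eval-based behavior described above can be approximated with this sketch; `ruby_directive` is a hypothetical stand-in for `AIA::Directives#ruby`, not the real method:

```ruby
# Hypothetical stand-in for AIA::Directives#ruby -- a simple eval
# whose result is coerced to a String for use as prompt text.
def ruby_directive(code)
  # Each call is a fresh method invocation, so local variables
  # do not survive between calls; instance or global variables would.
  String(eval(code))
end
```

Because the result is coerced to a String, even `//ruby 2 + 2` would contribute the text "4" to the prompt.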
#### //shell

Example:
```
//shell some_shell_command
```

It is expected that the shell command will return some text to STDOUT, which will be pre-pended to the existing prompt text within the prompt file.

There are no limitations on what the shell command can be. For example, if you wanted to bypass the stripping of comments and directives from a file, you could do something like this:
```
//shell cat path_to_file
```

This does basically the same thing as the `//include` directive, except it uses the entire content of the file. For relative file paths the same thing applies: the file's path will be relative to the PWD.


### Backend Directive Commands

See the source code for the directives supported by the backends, which at this time are configuration-based as well.

- [mods](lib/aia/tools/mods.rb)
- [sgpt](lib/aia/tools/sgpt.rb)

For example, `mods` has a configuration item `topp` which can be set by a directive in a prompt text file directly.

```
//topp 1.5
```

If `mods` is not the backend, the `//topp` directive is ignored.

### Using Directives in Chat Sessions

When you are in a chat session, you may use a directive as a follow-up prompt. For example, if you started the chat session with the option `--terse` expecting to get short answers from the backend, but then decide that you want more comprehensive answers, you may do this:

```
//config terse? false
```

The directive is executed and a new follow-up prompt can be entered, with a more lengthy response generated from the backend.


## Prompt Sequences

Why would you need/want to use a sequence of prompts in a batch situation? Maybe you have a complex prompt which exceeds the token limitations of your model for input, so you need to break it up into multiple parts. Or suppose it's a simple prompt but the number of tokens on the output is limited and you do not get exactly the kind of full response for which you were looking.

Sometimes it takes a series of prompts to get the kind of response that you want. The response from one prompt becomes a context for the next prompt.
This is easy to do within a `chat` session, where you are manually entering and adjusting your prompts until you get the kind of response that you want.

If you need to do this on a regular basis or within a batch, you can use `aia` with the `--next` and `--pipeline` command line options.

These two options specify the sequence of prompt IDs to be processed. Both options are available to be used within a prompt file using the `//config` directive. Like all embedded directives, you can take advantage of parameterization, shell integration, and Ruby. I'm starting to feel like Tim the Tool Man - more power!

Consider the condition in which you have 4 prompt IDs that need to be processed in sequence. The IDs and associated prompt file names are:

| Prompt ID | Prompt File |
| --------- | ----------- |
| one       | one.txt     |
| two       | two.txt     |
| three     | three.txt   |
| four      | four.txt    |


### --next

```shell
export AIA_OUT_FILE=temp.md
aia one --next two
aia three --next four temp.md
```

or within each of the prompt files you use the config directive:

```
one.txt contains //config next two
two.txt contains //config next three
three.txt contains //config next four
```

BUT if you have more than two prompts in your sequence, then consider using the --pipeline option.

**The directive //next is short for //config next**

### --pipeline

`aia one --pipeline two,three,four`

or inside of the `one.txt` prompt file use this directive:

`//config pipeline two,three,four`

**The directive //pipeline is short for //config pipeline**

### Best Practices ??

Since the response of one prompt is fed into the next prompt within the sequence, instead of having all prompts write their response to the same out file, use these directives inside the associated prompt files:

| Prompt File | Directive |
| --- | --- |
| one.txt | //config out_file one.md |
| two.txt | //config out_file two.md |
| three.txt | //config out_file three.md |
| four.txt | //config out_file four.md |

This way you can see the response that was generated for each prompt in the sequence.

### Example Pipeline

TODO: the audio-to-text is still under development.

Suppose you have an audio file of a meeting. You want to get a transcription of what was said in that meeting. Sometimes raw transcriptions hide the real value of the recording, so you have crafted a prompt that takes the raw transcription and does a technical summary with a list of action items.

Create two prompts named transcribe.txt and tech_summary.txt

```
# transcribe.txt
# Desc: takes one audio file
# note that there is no "prompt" text only the directive

//config backend  client
//config model    whisper-1
//next            tech_summary
```

and

```
# tech_summary.txt

//config model    gpt-4-turbo
//config out_file meeting_summary.md

Review the raw transcript of a technical meeting,
summarize the discussion and
note any action items that were generated.

Format your response in markdown.
```

Now you can do this:

```
aia transcribe my_tech_meeting.m4a
```

Your summary of the meeting is in the file `meeting_summary.md`.


## All About ROLES

### The --roles_dir (AIA_ROLES_DIR)

There are two kinds of prompts:
1. instructional - tells the LLM what to do
2. personification - tells the LLM who it should pretend to be when it does its transformational work.

That second kind of prompt is called a role. Sometimes the role is incorporated into the instruction. For example, "As a magician make a rabbit appear out of a hat." To reuse the same role in multiple prompts, `aia` encourages you to designate a special `roles_dir` into which you put prompts that are specific to personification - roles.

The default `roles_dir` is a sub-directory of the `prompts_dir` named roles.
You can, however, put your `roles_dir` anywhere that makes sense to you.

### The --role Option

The `--role` option is used to identify a personification prompt within your roles directory which defines the context within which the LLM is to provide its response. The text of the role ID is pre-pended to the text of the primary prompt to form a complete prompt to be processed by the backend.

For example, consider:

```shell
aia -r ruby refactor my_class.rb
```

Within the roles directory, the contents of the text file `ruby.txt` will be pre-pended to the contents of the `refactor.txt` file from the prompts directory to produce a complete prompt. That complete prompt will have any parameters and then directives processed before the combined prompt text is sent to the backend.

Note that `--role` is just a way of saying "add this prompt text file to the front of this other prompt text file." The contents of the "role" prompt could be anything. It does not necessarily have to be an actual role.

`aia` fully supports a directory tree within the `prompts_dir` as a way of organizing or classifying your different prompt text files.

```shell
aia -r sw_eng doc_the_methods my_class.rb
```

In this example the prompt text file `$AIA_ROLES_DIR/sw_eng.txt` is prepended to the prompt text file `$AIA_PROMPTS_DIR/doc_the_methods.txt`.


### Other Ways to Insert Roles into Prompts

Since `aia` supports parameterized prompts, you could make a keyword like "[ROLE]" part of your prompt. For example, consider this prompt:

```text
As a [ROLE] tell me what you think about [SUBJECT]
```

When this prompt is processed, `aia` will ask you for a value for the keyword "ROLE" and the keyword "SUBJECT" to complete the prompt. Since `aia` maintains a history of your previous answers, you could just choose something that you used in the past or answer with a completely new value.
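Keyword substitution of this kind can be sketched as follows. In `aia` the real work is done by the `prompt_manager` gem; `fill_keywords` here is a made-up illustration:

```ruby
# Illustrative [KEYWORD] substitution; in aia the prompt_manager gem
# handles this, including the history of previous answers.
def fill_keywords(prompt, answers)
  # Keywords are UPPERCASE words (spaces allowed) in square brackets;
  # unanswered keywords are left in place.
  prompt.gsub(/\[([A-Z_ ]+)\]/) { answers.fetch($1, "[#{$1}]") }
end
```

Used on the example above, `fill_keywords("As a [ROLE] tell me what you think about [SUBJECT]", "ROLE" => "magician", "SUBJECT" => "rabbits")` would produce a fully resolved prompt.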
## External CLI Tools Used

To install the external CLI programs used by aia:

  brew install fzf mods rg glow

fzf
  Command-line fuzzy finder written in Go
  [https://github.com/junegunn/fzf](https://github.com/junegunn/fzf)

mods
  AI on the command-line
  [https://github.com/charmbracelet/mods](https://github.com/charmbracelet/mods)

rg
  Search tool like grep and The Silver Searcher
  [https://github.com/BurntSushi/ripgrep](https://github.com/BurntSushi/ripgrep)

glow
  Render markdown on the CLI
  [https://github.com/charmbracelet/glow](https://github.com/charmbracelet/glow)

A text editor whose executable is set up in the system environment variable 'EDITOR' like this:

  export EDITOR="subl -w"

### Optional External CLI-tools

#### Backend Processor `llm`

```
llm  Access large language models from the command-line
     |   brew install llm
     |__ https://llm.datasette.io/
```

As of `aia v0.5.13` the `llm` backend processor is available in a limited integration. It is a very powerful python-based implementation that has its own prompt templating system. The reason it is included within the `aia` environment is its ability to make use of local LLM models.


#### Backend Processor `sgpt`

`shell-gpt` aka `sgpt` is also a python implementation of a CLI-tool that processes prompts through OpenAI. It has fewer features than both `mods` and `llm` and is less flexible.

#### Occasionally Useful Tool `plz`

`plz-cli` aka `plz` is not integrated with `aia`; however, it gets an honorable mention for its ability to accept a prompt that is tailored to doing something on the command line. Its response is a CLI command (sometimes a piped sequence) that accomplishes the task set forth in the prompt.
It will return the commands to be executed against the data files you specified, with a query asking whether to execute the command.

- brew install plz-cli

## Shell Completion

You can set up a completion function in your shell that will complete on the prompt IDs saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do this:

```shell
aia --completion bash
```

If you're not a fan of "born again", replace `bash` with one of the others.

Copy the function to a place where it can be installed in your shell's instance. This might be a `.profile` or `.bashrc` file, etc.

## My Most Powerful Prompt

This is just between you and me, so don't go blabbing this around to everyone. My most powerful prompt is in a file named `ad_hoc.txt`. It looks like this:

> [WHAT NOW HUMAN]

Yep. Just a single parameter for which I can provide a value of anything that is on my mind at the time. Its advantage is that I do not pollute my shell's command history with lots of text.

Which do you think is better to have in your shell's history file?

```shell
mods "As a certified public accountant specializing in forensic audit and analysis of public company financial statements, what do you think of mine?  What is the best way to hide the millions of drachma that I've skimmed?" < financial_statement.txt
```

or

```shell
aia ad_hoc financial_statement.txt
```

Both do the same thing; however, `aia` does not put the text of the prompt into the shell's history file.... Of course the keyword/parameter value is saved in the prompt's JSON file, and the prompt with the response are logged unless `--no-log` is specified; but it's not messing up the shell history!

## My Configuration

I use the `bash` shell.
In my `.bashrc` file I source another file named `.bashrc__aia` which looks like this:

```shell
# ~/.bashrc__aia
# AI Assistant

# These are the defaults:
export AIA_PROMPTS_DIR=~/.prompts
export AIA_OUT_FILE=./temp.md
export AIA_LOG_FILE=$AIA_PROMPTS_DIR/_prompts.log
export AIA_BACKEND=mods
export AIA_MODEL=gpt-4-1106-preview

# Not a default.  Invokes spinner.
export AIA_VERBOSE=true

alias chat='aia chat --terse'

# rest of the file is the completion function
```

Here is what my `chat` prompt file looks like:

```shell
# ~/.prompts/chat.txt
# Desc: Start a chat session

//config chat? = true

[WHAT]
```

## Executable Prompts

With all of the capabilities of the AI Assistant, you can create your own executable prompts. These prompts can be used to automate tasks, generate content, or perform any other action that you can think of. All you need to get started with executable prompts is a prompt that does not do anything. For example, consider my `run.txt` prompt.

```
# ~/.prompts/run.txt
# Desc: Run executable prompts coming in via STDIN
```

Remember that the '#' character indicates a comment line, making the `run` prompt ID basically a do-nothing prompt.

An executable prompt can reside anywhere, either in your $PATH or not. That is your choice. It must, however, be executable. Consider the following `top10` executable prompt:

```
#!/usr/bin/env aia run --no-out_file
# File: top10
# Desc: The top 10 cities by population

what are the top 10 cities by population in the USA. Summarize what people
like about living in each city. Include an average cost of living. Include
links to the Wikipedia pages.  Format your response as a markdown document.
```

Make sure that it is executable.

```shell
chmod +x top10
```

The magic is in the first line of the prompt. It is a shebang line that tells the system how to execute the prompt.
In this case it is telling the system to use the `aia` command line tool to execute the `run` prompt. The `--no-out_file` option tells the `aia` command line tool not to write the output of the prompt to a file. Instead it will write the output to STDOUT. The remaining content of this `top10` prompt is sent via STDIN to the configured backend LLM processor.

Now just execute it like any other command in your terminal.

```shell
./top10
```

Since its output is going to STDOUT, you can set up a pipe chain. Using the CLI program `glow` to render markdown in the terminal (brew install glow):

```shell
./top10 | glow
```

This executable prompt concept sets up the building blocks of a *nix CLI-based pipeline in the same way that the --pipeline and --next options and directives are used.

## Development

This CLI tool started life as a few lines of Ruby in a file in my scripts repo. It just kept growing as I decided to add more capability and more backend tools. There was no real architecture to guide the design. What is left is a large code mess which is slowly being refactored into something more maintainable. That work is taking place in the `develop` branch. I welcome your help. Take a look at what is going on in that branch and send me a PR against it.

Of course, if you see something in the main branch, send me a PR against that one so that we can fix the problem for all.

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/MadBomber/aia.

When you find problems with `aia` please note them as an issue. This thing was written mostly by a human and you know how error prone humans are. There should be plenty of errors to find.

I'm not happy with the way some command line options for the external commands are hard coded. I'm specifically talking about the way in which the `rg` and `fzf` tools are used. Their options decide the basic look and feel of the search capability on the command line.
Maybe they should be part of the overall configuration so that users can tune their UI to the way they like.

## License

The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).