Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/fynnfluegge/doc-comments-ai
LLM-powered code documentation generation
- Host: GitHub
- URL: https://github.com/fynnfluegge/doc-comments-ai
- Owner: fynnfluegge
- License: mit
- Created: 2023-08-17T22:21:55.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-04-04T00:25:19.000Z (6 months ago)
- Last Synced: 2024-04-05T00:00:30.419Z (6 months ago)
- Topics: docstring, documentation-tool, gpt-35-turbo, gpt-4, javadoc, jsdoc, llm, ollama, openai, rustdoc
- Language: Python
- Homepage:
- Size: 383 KB
- Stars: 107
- Watchers: 3
- Forks: 19
- Open Issues: 7
- Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Code documentation generation with LLMs
[![Build](https://github.com/fynnfluegge/doc-comments.ai/actions/workflows/build.yaml/badge.svg)](https://github.com/fynnfluegge/doc-comments.ai/actions/workflows/build.yaml)
[![Publish](https://github.com/fynnfluegge/doc-comments.ai/actions/workflows/publish.yaml/badge.svg)](https://github.com/fynnfluegge/doc-comments.ai/actions/workflows/publish.yaml)

Focus on writing your code, let LLMs write the documentation for you. With just a few keystrokes in your terminal, using OpenAI or 100% local LLMs without any data leaks.

Built with [langchain](https://github.com/langchain-ai/langchain), [treesitter](https://github.com/tree-sitter/tree-sitter), [llama.cpp](https://github.com/ggerganov/llama.cpp) and [ollama](https://github.com/jmorganca/ollama)
![doc_comments_ai_demo](https://github.com/fynnfluegge/doc-comments-ai/assets/16321871/664bc581-a2a0-49ea-87f9-343f49f05e97)
## ✨ Features

- 📝 Generate documentation comment blocks for all methods in a file
  - e.g. Javadoc, JSDoc, Docstring, Rustdoc, etc.
- ✏️ Generate inline documentation comments in method bodies
- 🌳 Treesitter integration
- 💻 Local LLM support
- 🌐 Azure OpenAI support

> [!NOTE]
> Documentation will only be added to files without unstaged changes, so nothing is overwritten.

## 🚀 Usage
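The commands below operate on a whole source file: the tool parses it, finds each method, and inserts a documentation comment. As a rough pure-Python analogy for that discovery step (the project itself uses Treesitter, not `ast`), the standard library can list the functions in a file that still lack docstrings:

```python
import ast

source = '''
def documented():
    """Already has a docstring."""
    return 1

def undocumented(x):
    return x * 2
'''

tree = ast.parse(source)
# Collect every function definition that has no docstring yet --
# these are the methods a documentation generator would target.
missing = [
    node.name
    for node in ast.walk(tree)
    if isinstance(node, ast.FunctionDef) and ast.get_docstring(node) is None
]
print(missing)  # ['undocumented']
```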
Create documentation for all methods in the file specified by `<file>`, using the GPT-3.5-Turbo model:
```
aicomment <file>
```

Also create documentation comments inside the method body:
```
aicomment <file> --inline
```
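For illustration, here is the kind of output such a run might produce for a small Python function: a docstring block for the method plus an inline comment in its body (a hypothetical example, not verbatim tool output):

```python
def slugify(title: str) -> str:
    """Convert a title into a URL-friendly slug.

    Args:
        title: The human-readable title to convert.

    Returns:
        The lowercased title with whitespace replaced by hyphens.
    """
    # Normalize case first, then join the words with hyphens.
    return "-".join(title.lower().split())

print(slugify("Hello World"))  # hello-world
```

The comment style follows the language's convention, e.g. Javadoc for Java or Rustdoc for Rust.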
Guided mode, confirm documentation generation for each method:
```
aicomment <file> --guided
```

Use the GPT-4 model:
```
aicomment <file> --gpt4
```

Use the GPT-3.5-Turbo-16k model:
```
aicomment <file> --gpt3_5-16k
```

Use Azure OpenAI:
```
aicomment <file> --azure-deployment <deployment_name>
```

Use local Llama.cpp:
```
aicomment <file> --local_model <model_path>
```
Use local Ollama:
```
aicomment <file> --ollama-model <model_name>
```

> [!NOTE]
> For how to download models from Hugging Face for local usage, see [Local LLM usage](https://github.com/fynnfluegge/doc-comments-ai#3-local-llm-usage).

> [!NOTE]
> If very extensive and descriptive documentation is needed, consider using GPT-4, GPT-3.5-Turbo-16k, or a comparable local model.

> [!IMPORTANT]
> Results with a local LLM depend heavily on the selected model. To get results comparable to GPT-3.5/GPT-4 you need very large models, which require powerful hardware.

## 🌍 Supported Languages
- [x] Python
- [x] Typescript
- [x] Javascript
- [x] Java
- [x] Rust
- [x] Kotlin
- [x] Go
- [x] C++
- [x] C
- [x] C#
- [x] Haskell

## 📋 Requirements
- Python >= 3.9
## 📦 Installation
Install in an isolated environment with `pipx`:
```
pipx install doc-comments-ai
```
If you are facing issues using pipx, you can also install directly from PyPI with:
```
pip install doc-comments-ai
```
However, it is recommended to use pipx instead to benefit from isolated environments for the dependencies.
For further help visit the [Troubleshooting](https://github.com/fynnfluegge/doc-comments-ai?tab=readme-ov-file#-troubleshooting) section.

### 1. OpenAI usage
Create your personal OpenAI API key and add it as `$OPENAI_API_KEY` to your environment with:
```bash
export OPENAI_API_KEY=<your_api_key>
```

### 2. Azure OpenAI usage
Add the following variables to your environment:
```bash
export AZURE_API_BASE="https://<your_resource_name>.openai.azure.com"
export AZURE_API_VERSION="2023-05-15"
```

### 3. Local LLM usage with Llama.cpp
When using a local LLM, no API key is required. On first usage of `--local_model` you will be asked to confirm the installation of `llama-cpp-python` and its dependencies.
The installation process will take care of the hardware-accelerated build tailored to your hardware and OS. For further details see:
[installation-with-hardware-acceleration](https://github.com/abetlen/llama-cpp-python#installation-with-hardware-acceleration)

The most convenient way to download a model from Hugging Face for local usage is the `huggingface-cli`:
```
huggingface-cli download TheBloke/CodeLlama-13B-Python-GGUF codellama-13b-python.Q5_K_M.gguf
```

This will download the `codellama-13b-python.Q5_K_M` model to `~/.cache/huggingface/`.
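If the printed path scrolls out of view, you can locate downloaded `.gguf` files yourself. A minimal sketch (the demo uses a throwaway directory standing in for `~/.cache/huggingface/`):

```python
from pathlib import Path
import tempfile

def find_gguf_models(cache_dir: Path) -> list[Path]:
    """Recursively collect all .gguf model files under the given directory."""
    return sorted(cache_dir.rglob("*.gguf"))

# Demo on a temporary directory standing in for ~/.cache/huggingface/
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "models").mkdir()
    (root / "models" / "codellama-13b-python.Q5_K_M.gguf").touch()
    print([p.name for p in find_gguf_models(root)])  # ['codellama-13b-python.Q5_K_M.gguf']
```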
After the download has finished, the absolute path of the `.gguf` file is printed to the console; use it as the value for `--local_model`.

> [!IMPORTANT]
> Since `llama.cpp` is used, the model must be in the `.gguf` format.

## 🛟 Troubleshooting
- #### During installation with `pipx`
```
pip failed to build package: tiktoken

Some possibly relevant errors from pip install:
error: subprocess-exited-with-error
error: can't find Rust compiler
```
Make sure the Rust compiler is installed on your system; you can get it from [here](https://www.rust-lang.org/tools/install).

## 🌟 Contributing
If you are missing a feature or facing a bug, don't hesitate to open an issue or raise a PR.
Any kind of contribution is highly appreciated!