Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
README file generator, powered by AI.
- Host: GitHub
- URL: https://github.com/eli64s/readme-ai
- Owner: eli64s
- License: mit
- Created: 2023-01-16T19:47:17.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2024-10-27T08:36:38.000Z (3 months ago)
- Last Synced: 2024-10-29T14:51:17.288Z (3 months ago)
- Topics: ai, ai-documentation, anthropic, badge-generator, cli, developer-tools, devtools, documentation, documentation-generator, gemini, gpt, hacktoberfest, markdown, markdown-generator, python, readme, readme-generator, readme-md, readme-md-generator, readme-template
- Language: Python
- Homepage: https://eli64s.github.io/readme-ai/
- Size: 219 MB
- Stars: 1,540
- Watchers: 15
- Forks: 166
- Open Issues: 29
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- Awesome-ChatGPT - README-AI - Command-line tool for crafting aesthetic, structured, and informative README.md files, powered by OpenAI's language model API. (Developer Libraries, SDKs, and APIs / Python)
- awesome-ccamel - eli64s/readme-ai - README file generator, powered by AI. (Python)
- awesome-chatgpt - README-AI - Command-line tool for crafting aesthetic, structured, and informative README.md files, powered by OpenAI's language model API. (Developer Libraries, SDKs, and APIs / Python)
- awesome-ai-devtools - README-AI
- StarryDivineSky - eli64s/readme-ai
- awesome-chatgpt - eli64s/readme-ai - Generate beautiful README files from the terminal using OpenAI's GPT language models (ChatGPT-based applications / Other sdk/libraries)
- awesome-list - eli64s/readme-ai - README file generator, powered by AI. (Python)
- project-awesome - eli64s/readme-ai - README file generator, powered by AI. (Python)
- awesome-ChatGPT-repositories - README-AI - 🚀 CLI tool that generates beautiful and informative README Markdown files. Powered by OpenAI's GPT APIs 💫 (CLIs)
- awesome-ai-dev-tools - README-AI - Automated README.md file generator. (Documentation Tools / IDE Extensions)
README
## Quick Links
- [Intro](#introduction)
- [Demo](#demo)
- [Features](#features)
- [Quickstart](#getting-started)
- [Configuration](#configuration)
- [Example Gallery](#example-gallery)
- [Contributing Guidelines](#contributing)

> [!IMPORTANT]
> Explore the [Official Documentation][docs] for a complete list of features, customization options, and examples.

## Introduction
ReadmeAI is a developer tool that automatically generates README files using a robust repository processing engine and advanced language models. Simply provide a URL or path to your codebase, and a well-structured and detailed README will be generated.
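
For example, generating a README for a remote repository is a single command (a minimal sketch; it assumes an OpenAI API key is already exported, as described in the Getting Started section below):

```sh
❯ readmeai --api openai --repository https://github.com/eli64s/readme-ai
```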
**Why Use ReadmeAI?**
This project aims to streamline the process of creating and maintaining documentation across all technical disciplines and experience levels. The core principles include:
- **🔵 Automate:** Generate detailed and structured README files with a single command.
- **⚫️ Customize:** Select from a variety of templates, styles, badges, and much more.
- **🟣 Flexible:** Switch between `OpenAI`, `Ollama`, `Anthropic`, and `Gemini` anytime.
- **🟠 Language Agnostic:** Compatible with a wide range of languages and frameworks.
- **🟡 Best Practices:** Ensure clean and consistent documentation across all projects.
- **✨ Offline Mode:** Create README files offline, without using an LLM API service.

## Demo
**Run from your terminal:**
[readmeai-cli-demo][cli-demo]
## Features
### Customize Your README
Let's begin by exploring various customization options and styles supported by ReadmeAI:
#### Header Styles

CLI Command:

```sh
$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
  --logo custom \
  --badge-color FF4B4B \
  --badge-style flat-square \
  --header-style classic
```

CLI Command:

```sh
$ readmeai --repository https://github.com/olliefr/docker-gs-ping \
  --badge-color 00ADD8 \
  --badge-style for-the-badge \
  --header-style modern \
  --navigation-style roman
```

CLI Command:

```sh
$ readmeai --repository https://github.com/rumaan/file.io-Android-Client \
  --badge-style plastic \
  --badge-color blueviolet \
  --logo PURPLE \
  --header-style COMPACT \
  --navigation-style NUMBER \
  --emojis solar
```
#### Banner Styles

CLI Command:

```sh
$ readmeai --repository https://github.com/emcf/thepipe \
  --badge-style flat-square \
  --badge-color 8a2be2 \
  --header-style console \
  --navigation-style accordion \
  --emojis water
```

CLI Command:

```sh
$ readmeai --repository https://github.com/FerrariDG/async-ml-inference \
  --badge-style plastic \
  --badge-color 43a047 \
  --header-style BANNER
```
#### And More!

CLI Command:

```sh
$ readmeai --repository 'https://github.com/eli64s/readme-ai-streamlit' \
  --badge-style FLAT-SQUARE \
  --badge-color E92063 \
  --header-style COMPACT \
  --navigation-style ACCORDION \
  --emojis RAINBOW \
  --logo ICE
```

CLI Command:

```sh
$ readmeai --repository https://github.com/jwills/buenavista \
  --align LEFT \
  --badge-style FLAT-SQUARE \
  --logo CUSTOM
```
### Generated Sections & Content
#### Project Introduction

- This section captures your project's essence and value proposition.
- The prompt template used to generate this section can be viewed [here][prompts.toml].

![][project-overview]

#### Features Table

- Detailed feature breakdown and technical capabilities.
- The prompt template used to generate this section can be viewed [here][prompts.toml].

![][features-table]

#### Project Structure

- Visual representation of your project's directory structure.
- The tree is generated using [pure Python][tree.py] and embedded in a code block.

![][project-structure]

#### Project Index

- Summarizes key modules of the project, which are also used as context for downstream [prompts.toml][prompts.toml].

![][project-index]

#### Getting Started Guides

- Dependencies and system requirements are extracted from the codebase during preprocessing.
- The [parsers][readmeai.parsers] handle most of the heavy lifting here.

![][installation-steps]

#### Installation, Usage, & Testing

- Setup instructions and usage guides are automatically created based on data extracted from the codebase.

![][usage-guides]

#### Community & Support

- Development roadmap, contribution guidelines, license information, and community resources.
- A return button is also included for easy navigation.

![][community-and-support]

#### Contribution Guides

- Instructions for contributing to the project, including resource links and a basic contribution guide.
- A graph of contributors is also included for open-source projects.

![][contributing-guidelines]
## Getting Started
### Prerequisites
ReadmeAI requires Python 3.9 or higher, and one of the following installation methods:
| Requirement | Details |
|--------------------------------------|----------------------------------|
| • [Python][python-link] ≥3.9 | Core runtime |
| **Installation Method** (choose one) | |
| • [pip][pip-link] | Default Python package manager |
| • [pipx][pipx-link] | Isolated environment installer |
| • [uv][uv-link] | High-performance package manager |
| • [docker][docker-link] | Containerized environment |
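
To quickly confirm the core runtime is available, you can check the versions from your shell (a minimal sketch; the exact commands depend on how Python is installed on your system):

```sh
❯ python3 --version   # should report Python 3.9 or newer
❯ pip --version       # confirms the default package manager is present
```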
### Supported Repository Platforms
To generate a README file, provide the source repository. ReadmeAI supports these platforms:
| Platform | Details |
|----------------------------|---------------------------|
| [File System][file-system] | Local repository access |
| [GitHub][github] | Industry-standard hosting |
| [GitLab][gitlab] | Full DevOps integration |
| [Bitbucket][bitbucket] | Atlassian ecosystem |
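
For reference, each platform corresponds to a different form of the `--repository` argument (the GitLab and Bitbucket URLs below are illustrative placeholders):

```sh
❯ readmeai --repository /users/username/projects/myproject       # local file system path
❯ readmeai --repository https://github.com/eli64s/readme-ai      # GitHub repository URL
❯ readmeai --repository https://gitlab.com/username/project      # GitLab (placeholder URL)
❯ readmeai --repository https://bitbucket.org/username/project   # Bitbucket (placeholder URL)
```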
### Supported LLM API Services
ReadmeAI is model agnostic, with support for the following LLM API services:
| Provider | Best For | Details |
|-----------------------------------|-----------------|--------------------------|
| [OpenAI][openai] | General use | Industry-leading models |
| [Anthropic][anthropic] | Advanced tasks | Claude language models |
| [Google Gemini][gemini] | Multimodal AI | Latest Google technology |
| [Ollama][ollama] | Open source | No API key needed |
| [Offline Mode][README-Offline.md] | Local operation | No internet required |
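
Each service maps to an `--api` value used in the Usage section below; a compact summary (the placeholder `<REPO>` stands for any supported repository URL or path):

```sh
❯ readmeai --api openai    -r <REPO>   # OpenAI models, e.g. gpt-3.5-turbo
❯ readmeai --api anthropic -r <REPO>   # Anthropic Claude models (requires the anthropic extra)
❯ readmeai --api gemini    -r <REPO>   # Google Gemini (requires the google-generativeai extra)
❯ readmeai --api ollama    -r <REPO>   # locally hosted open-source models, no API key needed
❯ readmeai --api offline   -r <REPO>   # offline mode, no LLM API calls at all
```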
---
### Installation
ReadmeAI is available on [PyPI][pypi-link] as readmeai and can be installed as follows:
####  Pip
Install with pip (recommended for most users):
```sh
❯ pip install -U readmeai
```
####  Pipx
With `pipx`, readmeai will be installed in an isolated environment:
```sh
❯ pipx install readmeai
```
####  Uv
The fastest way to install readmeai is with [uv][uv-link]:
```sh
❯ uv tool install readmeai
```
####  Docker
To run `readmeai` in a containerized environment, pull the latest image from [Docker Hub][docker-link]:
```sh
❯ docker pull zeroxeli/readme-ai:latest
```
####  From source
To build readmeai from source:
1. **Clone the repository:**
```sh
❯ git clone https://github.com/eli64s/readme-ai
```
2. **Navigate to the project directory:**
```sh
❯ cd readme-ai
```
3. **Install dependencies:**
```sh
❯ pip install -r setup/requirements.txt
```
Alternatively, use the setup script (`setup/setup.sh`) to install dependencies:
#####  Bash
1. **Run the setup script:**
```sh
❯ bash setup/setup.sh
```
Or, use `poetry` to build and install project dependencies:
#####  Poetry
1. **Install dependencies with poetry:**
```sh
❯ poetry install
```
### Additional Optional Dependencies
> [!IMPORTANT]
> To use the **Anthropic** and **Google Gemini** clients, extra dependencies are required. Install the package with the following extras:
>
> - **Anthropic:**
> ```sh
> ❯ pip install "readmeai[anthropic]"
> ```
> - **Google Gemini:**
> ```sh
> ❯ pip install "readmeai[google-generativeai]"
> ```
>
> - **Install Multiple Clients:**
> ```sh
> ❯ pip install "readmeai[anthropic,google-generativeai]"
> ```
### Usage
#### Set your API key
When running `readmeai` with a third-party service, you must provide a valid API key. For example, the `OpenAI` client is set as follows:
```sh
❯ export OPENAI_API_KEY=<your_api_key>
# For Windows users:
❯ set OPENAI_API_KEY=<your_api_key>
```
The environment variables for Ollama, Anthropic, and Google Gemini are shown below.

**Ollama**
Refer to the [Ollama documentation][ollama] for more information on setting up the Ollama server.
To start, follow these steps:
1. Pull your model of choice from the Ollama repository:
```sh
❯ ollama pull llama3.2:latest
```
2. Start the Ollama server and set the `OLLAMA_HOST` environment variable:
```sh
❯ export OLLAMA_HOST=127.0.0.1 && ollama serve
```
**Anthropic**
1. Export your Anthropic API key:
```sh
❯ export ANTHROPIC_API_KEY=<your_api_key>
```
**Google Gemini**

1. Export your Google Gemini API key:
```sh
❯ export GOOGLE_API_KEY=<your_api_key>
```

#### Using the CLI
##### Running with a LLM API service
Below is the minimal command required to run `readmeai` using the `OpenAI` client:
```sh
❯ readmeai --api openai -o readmeai-openai.md -r https://github.com/eli64s/readme-ai
```
> [!IMPORTANT]
> The default model is set to `gpt-3.5-turbo`, offering the best balance between cost and performance. When using any model from the `gpt-4` series and up, please monitor your costs and usage to avoid unexpected charges.
ReadmeAI can easily switch between API providers and models. We can run the same command as above with the `Anthropic` client:
```sh
❯ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -o readmeai-anthropic.md -r https://github.com/eli64s/readme-ai
```
And finally, with the `Google Gemini` client:
```sh
❯ readmeai --api gemini -m gemini-1.5-flash -o readmeai-gemini.md -r https://github.com/eli64s/readme-ai
```
##### Running with local models
We can also run `readmeai` with free and open-source locally hosted models using Ollama:
```sh
❯ readmeai --api ollama --model llama3.2 -r https://github.com/eli64s/readme-ai
```
##### Running on a local codebase
To generate a README file from a local codebase, simply provide the full path to the project:
```sh
❯ readmeai --repository /users/username/projects/myproject --api openai
```
Adding more customization options:
```sh
❯ readmeai --repository https://github.com/eli64s/readme-ai \
--output readmeai.md \
--api openai \
--model gpt-4 \
--badge-color A931EC \
--badge-style flat-square \
--header-style compact \
--navigation-style fold \
--temperature 0.9 \
--tree-depth 2 \
--logo LLM \
--emojis solar
```
##### Running in offline mode
ReadmeAI supports `offline mode`, allowing you to generate README files without using an LLM API service.
```sh
❯ readmeai --api offline -o readmeai-offline.md -r https://github.com/eli64s/readme-ai
```
####  Docker
Run the `readmeai` CLI in a Docker container:
```sh
❯ docker run -it --rm \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
--repository https://github.com/eli64s/readme-ai \
--api openai
```
####  Streamlit
Try readme-ai directly in your browser on Streamlit Cloud; no installation required.
[readme-ai.streamlit.app][streamlit-link]
See the [readme-ai-streamlit][readme-ai-streamlit] repository on GitHub for more details about the application.
> [!WARNING]
> The readme-ai Streamlit web app may not always be up-to-date with the latest features. Please use the command-line interface (CLI) for the most recent functionality.
####  From source
To run readmeai from source:
#####  Bash
If you installed the project from source with the bash script, run the following command:
1. Activate the virtual environment:
```sh
❯ conda activate readmeai
```
2. Run the CLI:
```sh
❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
```
#####  Poetry
1. Activate the virtual environment:
```sh
❯ poetry shell
```
2. Run the CLI:
```sh
❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
```
### Testing
The [pytest][pytest-link] and [nox][nox-link] frameworks are used for development and testing.
Install the dependencies with uv:
```sh
❯ uv pip install --dev --group test --all-extras
```
Run the unit test suite using Pytest:
```sh
❯ make test
```
Using nox, test the app against Python versions `3.9`, `3.10`, `3.11`, and `3.12`:
```sh
❯ make test-nox
```
> [!TIP]
> Nox is an automation tool for testing applications in multiple environments. This helps ensure your project is compatible across Python versions and environments.
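
For example, nox can be invoked directly when iterating locally (a sketch; it assumes the project's noxfile defines sessions parameterized by Python version):

```sh
❯ nox --list          # show the sessions defined in noxfile.py
❯ nox --python 3.12   # run only the sessions parameterized for Python 3.12
```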
## Configuration
Customize your README generation with a variety of supported options and style settings, such as:
| Option | Description | Default |
|----------------------|-----------------------------------------------|-----------------|
| `--align` | Text alignment in header | `center` |
| `--api` | LLM API service provider | `offline` |
| `--badge-color` | Badge color name or hex code | `0080ff` |
| `--badge-style` | Badge icon style type | `flat` |
| `--header-style` | Header template style | `classic` |
| `--navigation-style` | Table of contents style | `bullet` |
| `--emojis` | Emoji theme packs prefixed to section titles | `None` |
| `--logo` | Project logo image | `blue` |
| `--logo-size` | Logo image size | `30%` |
| `--model` | Specific LLM model to use | `gpt-3.5-turbo` |
| `--output` | Output filename | `readme-ai.md` |
| `--repository` | Repository URL or local directory path | `None` |
| `--temperature` | Creativity level for content generation | `0.1` |
| `--tree-max-depth` | Maximum depth of the directory tree structure | `2` |
Run the following command to view all available options:
```sh
❯ readmeai --help
```
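
Because `--api` defaults to `offline` and `--output` defaults to `readme-ai.md`, the shortest possible invocation only needs a repository (a sketch based on the defaults in the table above):

```sh
❯ readmeai -r https://github.com/eli64s/readme-ai
# equivalent to: readmeai --api offline --output readme-ai.md --repository https://github.com/eli64s/readme-ai
```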
Visit the [Official Documentation][docs] for a complete guide on configuring and customizing README files.
## Example Gallery
This gallery showcases a diverse collection of README examples generated across various programming languages, frameworks, and project types.
| Tech | README File | Repository | Project Description |
|:--------------------|:------------------------|:-----------------------|:---------------------------|
| Python | [README-Python.md] | [readmeai] | ReadmeAI's core project |
| Apache Flink | [README-Flink.md] | [pyflink-poc] | PyFlink proof of concept |
| Streamlit | [README-Streamlit.md] | [readmeai-streamlit] | Web application interface |
| Vercel & NPM | [README-Vercel.md] | [github-readme-quotes] | Deployment showcase |
| Go & Docker | [README-DockerGo.md] | [docker-gs-ping] | Containerized Golang app |
| FastAPI & Redis | [README-FastAPI.md] | [async-ml-inference] | ML inference service |
| Java | [README-Java.md] | [minimal-todo] | Minimalist To-Do app |
| PostgreSQL & DuckDB | [README-PostgreSQL.md] | [buenavista] | Database proxy server |
| Kotlin | [README-Kotlin.md] | [android-client] | Mobile client application |
| Offline Mode | [README-Offline.md] | [litellm] | Offline functionality demo |
### Community Contribution
#### Share Your README Files
We invite developers to share their generated README files in our [Show & Tell][show-and-tell] discussion category. Your contributions help:
- Showcase diverse documentation styles
- Provide real-world examples
- Help improve the ReadmeAI tool
Find additional README examples in our [examples directory][examples-directory] on GitHub.
## Roadmap
* [ ] Release `readmeai 1.0.0` with robust documentation creation and maintenance capabilities.
* [ ] Extend template support for various `project types` and `programming languages`.
* [ ] Develop `Vscode Extension` to generate README files directly in the editor.
* [ ] Develop `GitHub Actions` to automate documentation updates.
* [ ] Add `badge packs` to provide additional badge styles and options.
  * [ ] Code coverage, CI/CD status, project version, and more.
## Contributing
Contributions are welcome! Please read the [Contributing Guide][contributing] to get started.
- **💡 [Contributing Guide][contributing]**: Learn about our contribution process and coding standards.
- **🐛 [Report an Issue][github-issues]**: Found a bug? Let us know!
- **💬 [Start a Discussion][github-discussions]**: Have ideas or suggestions? We'd love to hear from you.
## Acknowledgments
A big shoutout to the projects below for their awesome work and open-source contributions:
- [Shields.io][shieldsio]
- [Simple Icons][simple-icons]
- [Skill Icons][skill-icons]
- [GitHub Profile Badges][github-profile-badges]
- [Markdown Badges][markdown-badges]
- [css.gg][css-icons]
## 🎗 License
Copyright © 2023-2025 [readme-ai][readme-ai].
Released under the [MIT][license] license.
[![][to-the-top]](#top)
[readme-ai]: https://github.com/eli64s/readme-ai
[readme-ai-streamlit]: https://github.com/eli64s/readme-ai-streamlit
[actions]: https://github.com/eli64s/readme-ai/actions
[codecov]: https://app.codecov.io/gh/eli64s/readme-ai
[docs]: https://eli64s.github.io/readme-ai
[github-discussions]: https://github.com/eli64s/readme-ai/discussions
[github-issues]: https://github.com/eli64s/readme-ai/issues
[github-pulls]: https://github.com/eli64s/readme-ai/pulls
[mit]: https://opensource.org/license/mit
[pepy]: https://www.pepy.tech/projects/readmeai
[contributing]: https://github.com/eli64s/readme-ai/blob/main/CONTRIBUTING.md
[license]: https://github.com/eli64s/readme-ai/blob/main/LICENSE
[to-the-top]: https://img.shields.io/badge/Return-5D4ED3?style=flat&logo=ReadMe&logoColor=white
[cli-demo]: https://github.com/user-attachments/assets/e1198922-5233-4a44-a5a8-15fa1cc4e2d7
[streamlit-demo]: https://github.com/user-attachments/assets/c3f60665-4768-4baa-8e31-6b6e8c4c9248
[docker-shield]: https://img.shields.io/badge/Docker-2496ED.svg?style=flat&logo=Docker&logoColor=white
[docker-link]: https://hub.docker.com/r/zeroxeli/readme-ai
[python-link]: https://www.python.org/
[pip-link]: https://pip.pypa.io/en/stable/
[pypi-shield]: https://img.shields.io/badge/PyPI-3775A9.svg?style=flat&logo=PyPI&logoColor=white
[pypi-link]: https://pypi.org/project/readmeai/
[pipx-shield]: https://img.shields.io/badge/pipx-2CFFAA.svg?style=flat&logo=pipx&logoColor=black
[pipx-link]: https://pipx.pypa.io/stable/
[uv-link]: https://docs.astral.sh/uv/
[pytest-shield]: https://img.shields.io/badge/Pytest-0A9EDC.svg?style=flat&logo=Pytest&logoColor=white
[pytest-link]: https://docs.pytest.org/en/7.1.x/contents.html
[nox-link]: https://nox.thea.codes/en/stable/
[streamlit-link]: https://readme-ai.streamlit.app/
[shieldsio]: https://shields.io/
[simple-icons]: https://simpleicons.org/
[skill-icons]: https://github.com/tandpfun/skill-icons
[github-profile-badges]: https://github.com/Aveek-Saha/GitHub-Profile-Badges
[markdown-badges]: https://github.com/Ileriayo/markdown-badges
[css-icons]: https://github.com/astrit/css.gg
[python-svg]: https://simpleicons.org/icons/python.svg
[pipx-svg]: https://simpleicons.org/icons/pipx.svg
[uv-svg]: https://simpleicons.org/icons/astral.svg
[docker-svg]: https://simpleicons.org/icons/docker.svg
[git-svg]: https://simpleicons.org/icons/git.svg
[bash-svg]: https://simpleicons.org/icons/gnubash.svg
[poetry-svg]: https://simpleicons.org/icons/poetry.svg
[streamlit-svg]: https://simpleicons.org/icons/streamlit.svg
[file-system]: https://en.wikipedia.org/wiki/File_system
[github]: https://github.com/
[gitlab]: https://gitlab.com/
[bitbucket]: https://bitbucket.org/
[anthropic]: https://docs.anthropic.com/en/home
[gemini]: https://ai.google.dev/tutorials/python_quickstart
[ollama]: https://github.com/ollama/ollama
[openai]: https://platform.openai.com/docs/quickstart/account-setup
[project-overview]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/project-overview/introduction.png?raw=true
[features-table]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/features/features.png?raw=true
[project-structure]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/project-structure/project-structure.png?raw=true
[project-index]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/project-structure/project-index.png?raw=true
[installation-steps]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/getting-started/installation-steps.png?raw=true
[usage-guides]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/getting-started/usage-guides.png?raw=true
[community-and-support]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/community/community-and-support.png?raw=true
[contributing-guidelines]: https://github.com/eli64s/readme-ai/blob/main/docs/docs/assets/img/community/contributing-guidelines.png?raw=true
[readmeai.parsers]: https://github.com/eli64s/readme-ai/tree/main/readmeai/parsers
[tree.py]: https://github.com/eli64s/readme-ai/blob/main/readmeai/generators/tree.py
[prompts.toml]: https://github.com/eli64s/readme-ai/blob/main/readmeai/config/settings/prompts.toml
[readmeai]: https://github.com/eli64s/readme-ai
[pyflink-poc]: https://github.com/eli64s/pyflink-poc
[readmeai-streamlit]: https://github.com/eli64s/readme-ai-streamlit
[github-readme-quotes]: https://github.com/PiyushSuthar/github-readme-quotes
[docker-gs-ping]: https://github.com/olliefr/docker-gs-ping
[async-ml-inference]: https://github.com/FerrariDG/async-ml-inference
[minimal-todo]: https://github.com/avjinder/Minimal-Todo
[buenavista]: https://github.com/jwills/buenavista
[android-client]: https://github.com/rumaan/file.io-Android-Client
[litellm]: https://github.com/BerriAI/litellm
[README-Python.md]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-ai.md
[README-Flink.md]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/modern.md
[README-Streamlit.md]: https://github.com/eli64s/readme-ai/blob/main/examples/banners/svg-banner.md
[README-Vercel.md]: https://github.com/eli64s/readme-ai/blob/main/examples/logos/dalle.md
[README-DockerGo.md]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-docker-go.md
[README-FastAPI.md]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-fastapi-redis.md
[README-Java.md]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/compact.md
[README-PostgreSQL.md]: https://github.com/eli64s/readme-ai/blob/main/examples/headers/classic.md
[README-Kotlin.md]: https://github.com/eli64s/readme-ai/blob/main/examples/readme-kotlin.md
[README-Offline.md]: https://github.com/eli64s/readme-ai/blob/main/examples/offline-mode/readme-litellm.md
[examples-directory]: https://github.com/eli64s/readme-ai/tree/main/examples
[show-and-tell]: https://github.com/eli64s/readme-ai/discussions/categories/show-and-tell