https://github.com/thiagobarbosa/documentry
AI-powered OpenAPI documentation generator for Next.js applications.
- Host: GitHub
- URL: https://github.com/thiagobarbosa/documentry
- Owner: thiagobarbosa
- Created: 2025-04-13T22:42:18.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2025-06-23T20:39:42.000Z (7 months ago)
- Last Synced: 2025-08-09T04:46:23.568Z (6 months ago)
- Topics: ai, api, docs, documentation, llm, nextjs, openapi
- Language: TypeScript
- Homepage:
- Size: 627 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Documentry
[npm](https://www.npmjs.com/package/documentry)
[MIT License](https://opensource.org/licenses/MIT)
Documentry is an AI-powered TypeScript library that uses LLMs to understand your Next.js API routes and
automatically generate detailed OpenAPI documentation in multiple formats: `json`, `yaml`, and interactive `html`.
With a single terminal command, `Documentry` scans every API route in your Next.js project,
understands the actual code of your `route.ts` files, and generates a valid `OpenAPI Specification (OAS)` file that
describes your endpoints.
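For context, a typical App Router handler that Documentry would scan might look like the following (a hypothetical `route.ts` for illustration only, not part of this repository):

```typescript
// app/api/user/route.ts — hypothetical example of a route Documentry would scan
import { NextResponse } from 'next/server'

export async function GET() {
  // Documentry reads this handler's source to infer the endpoint's behavior and response shape
  return NextResponse.json({ id: 1, name: 'Ada Lovelace' })
}
```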
## Key Features
- 🚀 Automatically scans your project and detects your Next.js API routes
- 🧠 Uses AI to understand the actual code of your routes
- 📝 Creates `OpenAPI 3.0` specifications in `json`, `yaml`, or interactive `html` format
- 🔄 Currently supports OpenAI and Anthropic models
## Installation
```bash
npm install documentry --save-dev
```
## Usage
### Command line
```bash
npx documentry
```
### Programmatic API
```typescript
import { Documentry } from 'documentry'
// Create a new Documentry instance
const documentry = new Documentry()
// Generate OpenAPI specs
await documentry.generate()
```
A full usage example:
```typescript
const documentry = new Documentry({
  provider: 'anthropic',
  model: 'claude-3-5-sonnet-latest',
  apiKey: process.env.ANTHROPIC_API_KEY,
  dir: './app/api',
  routes: ['/user', '/products/*'],
  outputFile: './docs/openapi',
  format: 'html', // 'yaml', 'json', or 'html'
  info: {
    title: 'My API',
    version: '1.0.0',
    description: 'My API description'
  },
  servers: [
    {
      url: 'http://localhost:3000/api',
      description: 'Local server'
    },
    {
      url: 'https://api.example.com',
      description: 'Production server'
    }
  ]
})
await documentry.generate()
```
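If you want to run generation as part of a docs build, one possible wrapper is sketched below (the file name and the `tsx` runner are assumptions, not part of Documentry; the constructor options come from the example above):

```typescript
// scripts/generate-docs.ts — hypothetical wrapper, run with e.g. `npx tsx scripts/generate-docs.ts`
import { Documentry } from 'documentry'

async function main() {
  const documentry = new Documentry({
    provider: 'anthropic',
    apiKey: process.env.ANTHROPIC_API_KEY,
    outputFile: './docs/openapi',
    format: 'yaml'
  })
  await documentry.generate()
}

main().catch((err) => {
  console.error('OpenAPI generation failed:', err)
  process.exit(1)
})
```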
*(Screenshot: an example of the generated OpenAPI documentation in HTML format.)*
More examples can be found in the [examples](src/examples) directory.
## Environment Variables
You can configure the LLM settings via a `.env` file:
```bash
LLM_PROVIDER=your-llm-provider # openai or anthropic; defaults to anthropic
LLM_MODEL=your-llm-model # defaults to claude-3-5-sonnet-latest
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
```
## Configuration Options
The CLI supports the following options:

| Flag | Description | Default |
|------|-------------|---------|
| `--dir <dir>` | Root directory of your Next.js API routes (`./app/api`, `./src/app/api`, etc.) | `./app/api` |
| `--routes <routes>` | List of routes to process (e.g., "/user,/products/*") | All routes are considered |
| `--servers <servers>` | List of server URLs (e.g. "url1\|description1, url2...") | `http://localhost:3000/api` |
| `-o, --output-file <file>` | Output folder/file for the generated OpenAPI specs | `./docs/openapi` |
| `-f, --format` | Format of the generated OpenAPI file (`yaml`, `json`, or `html`) | `yaml` |
| `-t, --title <title>` | Title for the OpenAPI spec | `Next.js API` |
| `-d, --description <description>` | Description for the OpenAPI spec | `API documentation for Next.js routes` |
| `-v, --version <version>` | Version for the OpenAPI spec | `1.0.0` |
| `-p, --provider <provider>` | LLM provider (`anthropic` or `openai`) | Env variable `LLM_PROVIDER` |
| `-m, --model <model>` | LLM model to use | Env variable `LLM_MODEL` |
| `-k, --api-key <key>` | LLM provider API key | Env variable `ANTHROPIC_API_KEY` or `OPENAI_API_KEY`, depending on the provider |
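For example, to generate an interactive HTML spec from routes under `./src/app/api` using OpenAI, you could run `npx documentry --dir ./src/app/api --format html --provider openai --api-key "$OPENAI_API_KEY"` (the values here are placeholders; adjust them for your project).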
## Development
### Prerequisites
- Node.js >= 14.0.0
- npm >= 6.0.0
### Setting up the Development Environment
1. Clone the repository:
```bash
git clone https://github.com/thiagobarbosa/documentry
cd documentry
```
2. Install dependencies:
```bash
npm install
```
3. Build the project:
```bash
npm run build
```
4. Run in development mode:
```bash
npm run dev
```
## License
This project is licensed under the [MIT License](LICENSE).