Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/seal-io/appilot
Chat to deploy and manage applications on any infrastructure
- Host: GitHub
- URL: https://github.com/seal-io/appilot
- Owner: seal-io
- License: apache-2.0
- Created: 2023-09-19T12:58:01.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-11-06T09:59:45.000Z (about 1 year ago)
- Last Synced: 2024-08-13T07:08:06.081Z (4 months ago)
- Language: Python
- Homepage:
- Size: 133 KB
- Stars: 124
- Watchers: 5
- Forks: 20
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- jimsghstars - seal-io/appilot - Chat to deploy and manage applications on any infrastructure (Python)
README
# Appilot
Appilot ['æpaɪlət] stands for application-pilot.
It is an experimental project that helps you operate applications using GPT-like LLMs.

## Features
- Application management: deploy, upgrade, rollback, etc.
- Environment management: clone, view topology, etc.
- Diagnosis: view logs, find flaws, and provide fixes.
- Safeguard: any action involving state changes requires human approval.
- Hybrid infrastructure: works on Kubernetes, VMs, cloud, and on-prem.
- Multi-language support: choose the natural language you're comfortable with.
- Pluggable backends: supports multiple backends including [Walrus](https://github.com/seal-io/walrus) and [Kubernetes](https://kubernetes.io), and is extensible.

## Demo
Chat to deploy llama-2 on AWS:
https://github.com/seal-io/appilot/assets/5697937/0562fe29-8e97-42ba-bbf6-eaa5b5fefc41
Other use cases:
- [Deploy from source code](./examples/walrus_deploy_source_code.md)
- [Manage environments](./examples/walrus_manage_environment.md)
- [Manage applications in Kubernetes using helm charts](./examples/k8s_helm.md)
- [Operate native Kubernetes resources](./examples/k8s_yaml.md)
- [Diagnose and fix issues](./examples/k8s_diagnose.md)

## Quickstart
> **Prerequisites:**
>
> - Get an OpenAI API key with access to the GPT-4 model.
> - Install `python3` and `make`.
> - Install [kubectl](https://kubernetes.io/docs/tasks/tools/) and [helm](https://helm.sh/docs/intro/install/).
> - Have a running Kubernetes cluster.
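As a quick sanity check (not part of the upstream steps), you can verify the prerequisites are in place with standard CLI commands before starting:

```
# Check the local tooling
python3 --version
make --version
kubectl version --client
helm version

# Confirm the Kubernetes cluster is reachable
kubectl cluster-info
```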
1. Clone the repository.
```
git clone https://github.com/seal-io/appilot && cd appilot
```
2. Run the following command to get the envfile.
```
cp .env.example .env
```
3. Edit the `.env` file and fill in `OPENAI_API_KEY`.
4. Run the following command to install. It will create a venv and install required dependencies.
```
make install
```
5. Run the following command to get started:
```
make run
```
6. Ask Appilot to deploy an application, e.g.:
```
> Deploy a jupyterhub.
...
> Get url of the jupyterhub.
```

## Usage
### Configuration
Appilot is configurable via environment variables or the envfile:
| Parameter | Description | Default |
|----------|------|---------------|
| OPENAI_API_KEY | OpenAI API key; access to the gpt-4 model is required. | "" |
| OPENAI_API_BASE | Custom OpenAI API base URL. You can integrate with other LLMs as long as they serve the same API style. | "" |
| TOOLKITS | Toolkits to enable. Currently supports Kubernetes and Walrus. Case insensitive. | "kubernetes" |
| NATURAL_LANGUAGE | Natural language used to interact with you, e.g., Chinese, Japanese, etc. | "English" |
| SHOW_REASONING | Show AI reasoning steps. | True |
| VERBOSE | Output in verbose mode. | False |
| WALRUS_URL | URL of Walrus; valid when the Walrus toolkit is enabled. | "" |
| WALRUS_API_KEY | API key of Walrus; valid when the Walrus toolkit is enabled. | "" |
| WALRUS_SKIP_TLS_VERIFY | Skip TLS verification for the Walrus API. Use when testing with self-signed certificates. Valid when the Walrus toolkit is enabled. | True |
| WALRUS_DEFAULT_PROJECT | Project name for the default context; valid when the Walrus toolkit is enabled. | "" |
| WALRUS_DEFAULT_ENVIRONMENT | Environment name for the default context; valid when the Walrus toolkit is enabled. | "" |
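For example, a minimal envfile for the default Kubernetes toolkit might look like the sketch below; the API key is a placeholder and the optional settings are only illustrative:

```
# .env (illustrative values only)
OPENAI_API_KEY=sk-your-key-here
TOOLKITS=kubernetes
NATURAL_LANGUAGE=English
SHOW_REASONING=True
VERBOSE=False
```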
### Using Kubernetes Backend
Follow the steps in the Quickstart to run with the Kubernetes backend.
### Using Walrus Backend
> **Prerequisites:** [Install Walrus](https://seal-io.github.io/docs/quickstart).
Walrus serves as the application management engine. It provides features like hybrid infrastructure support, environment management, etc.
To enable the Walrus backend, edit the envfile:

1. Set `TOOLKITS=walrus`.
2. Fill in `OPENAI_API_KEY`, `WALRUS_URL`, and `WALRUS_API_KEY` (a sketch is shown below).
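A sketch of the Walrus-related entries in the envfile; the URL and keys below are placeholders:

```
TOOLKITS=walrus
OPENAI_API_KEY=sk-your-key-here
WALRUS_URL=https://walrus.example.com
WALRUS_API_KEY=your-walrus-api-key
# Only for test setups with self-signed certificates
WALRUS_SKIP_TLS_VERIFY=True
```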
Then you can run Appilot to get started:
```
make run
```

### Run with Docker
You can run Appilot in a Docker container when using the Walrus backend.
> **Prerequisites:** Install `docker`.
1. Get an envfile by running the following command:
```
cp .env.example .env
```
2. Configure the `.env` file:
- Set `TOOLKITS=walrus`
- Fill in `OPENAI_API_KEY`, `WALRUS_URL`, and `WALRUS_API_KEY`.
3. Run the following command:
```
docker run -it --env-file .env sealio/appilot:main
```
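If you prefer not to keep an envfile, the same variables can be passed with Docker's `-e` flags instead of `--env-file`; the values below are placeholders:

```
docker run -it \
  -e TOOLKITS=walrus \
  -e OPENAI_API_KEY=sk-your-key-here \
  -e WALRUS_URL=https://walrus.example.com \
  -e WALRUS_API_KEY=your-walrus-api-key \
  sealio/appilot:main
```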
### Using LLM alternatives to GPT-4
You can use other LLMs as the reasoning engine of Appilot, as long as they serve inference APIs in an OpenAI-compatible way.
1. Configure the `.env` file and set `OPENAI_API_BASE=https://your-api-base` (see the sketch after this list).
2. Run Appilot as normal.
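For example, to point Appilot at a locally hosted OpenAI-compatible server, the envfile might contain something like the following. The endpoint is hypothetical and the exact base path depends on the serving stack; many compatible servers ignore the API key, but the client may still expect a non-empty value:

```
OPENAI_API_BASE=http://localhost:8000/v1
OPENAI_API_KEY=placeholder
```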
## How it works
The following is the architecture diagram of Appilot:
![appilot-arch](https://github.com/seal-io/appilot/assets/5697937/914cb60d-60ab-4b4d-8661-82f89d85683b)
## License
Copyright (c) 2023 [Seal, Inc.](https://seal.io)
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License in the [LICENSE](./LICENSE) file.

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.