https://github.com/open-kbs/ai-cloud-master
An AI-driven cloud assistant that executes AWS-related tasks using Node.js
- Host: GitHub
- URL: https://github.com/open-kbs/ai-cloud-master
- Owner: open-kbs
- License: MIT
- Created: 2024-09-15T13:14:24.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-10-22T10:58:53.000Z (12 months ago)
- Last Synced: 2024-10-23T08:04:26.230Z (12 months ago)
- Topics: ai, ai-agents, aws, cloudagent, gpt-4, open-source, openkbs
- Language: JavaScript
- Homepage: https://openkbs.com/apps/
- Size: 3.58 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# AI Cloud Master · [MIT License](https://github.com/open-kbs/ai-cloud-master/blob/main/LICENSE)
An AI-driven cloud assistant that executes AWS-related tasks using Node.js.
## OpenKBS + AWS = Mobile Cloud Agent
## Installation Guide
### Step 1: Install OpenKBS CLI and Login
Install the OpenKBS CLI on your system and log in:
```bash
npm install -g openkbs
openkbs login
```

If you do not have npm installed, you can download a prebuilt binary instead: https://github.com/open-kbs/openkbs (Download Binary).
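Before running the install, it can help to confirm that npm is actually on your `PATH` so you know whether to use the npm route or the prebuilt binary. The `check_cmd` helper below is just an illustrative sketch, not part of the OpenKBS tooling:

```shell
# Sketch: pre-flight check for npm before attempting the global install.
check_cmd() {
  # Succeeds (and prints a note) only if the command exists in PATH.
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 found"
  else
    echo "$1 not found"
    return 1
  fi
}

check_cmd npm || echo "No npm: download the prebuilt openkbs binary instead"
```

If npm is present, continue with `npm install -g openkbs`; otherwise use the binary linked above.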
### Step 2: Clone the Repository
Clone the repository to your local machine:
```bash
git clone git@github.com:open-kbs/ai-cloud-master.git
cd ai-cloud-master
```

### Step 3: Deploy the Application to the OpenKBS Cloud
(To run the backend services locally instead, continue reading below.)

Deploy your application using the OpenKBS CLI:
```bash
openkbs push
```

Once the deployment is complete, you will receive a URL for your app: `https://{kbId}.apps.openkbs.com`.
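Here `{kbId}` stands for the knowledge-base id assigned during the push. As a small sketch, the app URL follows this pattern (the id below is made up for illustration):

```shell
# Sketch: build the app URL from a kbId (example value is hypothetical).
KB_ID="example-kb-id"            # replace with the id printed by `openkbs push`
APP_URL="https://${KB_ID}.apps.openkbs.com"
echo "$APP_URL"                  # → https://example-kb-id.apps.openkbs.com
```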
Log in to your KB and have fun!

### Step 4: Running the Frontend Locally for Development
Run the OpenKBS UI dev server locally:
```bash
npm i
npm start
```

### Step 5: Running the Backend Locally
Run the Chat server locally:
```bash
npm run chat
```

- Enter your `OPENAI_KEY` when prompted. This key will be stored at `~/.openkbs/.env`.
- From the OpenKBS UI, change the chat model to a GPT-* or on-premises model.

### Step 6: Running the AI Services Locally on Your Own GPU
To run this AI app on your own GPU with Llama 3.1 and Stable Diffusion 3, read more here: [Install OpenKBS AI Server](https://github.com/open-kbs/openkbs?tab=readme-ov-file#installing-openkbs-ai-server-and-integrating-llama-31-and-stable-diffusion-3-locally)

## Install via Web
To install this app via our website, visit [AI Cloud Master](https://openkbs.com/apps/ai-cloud-master/).

## License
This project is licensed under the MIT License. For more details, please refer to the [LICENSE](https://github.com/open-kbs/ai-cloud-master/blob/main/LICENSE) file.