https://github.com/brycecanyoncounty/bcc-ai
A RedM standalone Development API System exposing popular AI (LLM) systems to developers. Supports: ChatGPT, Google Gemini, Mistral AI, Horde AI
- Host: GitHub
- URL: https://github.com/brycecanyoncounty/bcc-ai
- Owner: BryceCanyonCounty
- License: gpl-3.0
- Created: 2023-11-08T08:40:07.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-11-12T07:03:39.000Z (about 2 months ago)
- Last Synced: 2024-11-12T07:34:41.711Z (about 2 months ago)
- Topics: ai-horde, cfx, chatgpt, gemini-ai, lua, mistralai, redm
- Language: JavaScript
- Homepage: https://bcc-scripts.com/
- Size: 158 KB
- Stars: 2
- Watchers: 3
- Forks: 3
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# BCC AI
> A RedM standalone Development API System exposing popular AI (LLM) systems to developers. Supports: ChatGPT, Google Gemini, Mistral AI, Horde AI
**This is an experimental project, use at your own discretion.**
## How to install
- Download this repo
- Copy and paste `bcc-ai` folder to `resources/bcc-ai`
- Add `ensure bcc-ai` to your `server.cfg` file (ABOVE any scripts that use it)
- Now you are ready to start coding!

## API Docs
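As a sketch, the load order in `server.cfg` might look like this (`my-ai-consumer` is a hypothetical resource that calls the bcc-ai exports):

```cfg
# bcc-ai must be ensured before any resource that uses its exports
ensure bcc-ai
ensure my-ai-consumer  # hypothetical script calling exports['bcc-ai']
```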
### Generate text based on a prompt with ChatGPT
```lua
local AI = exports['bcc-ai'].Initiate('gpt', 'YOUR_KEY_HERE')
local test = AI.generateText({
    prompt = "Hello what is your name?",
    model = "gpt-3.5-turbo",
    max_tokens = 40,
    temperature = 1
})

print(test[1].text)
```

### Generate text based on a prompt with Google Gemini
```lua
local AI = exports['bcc-ai'].Initiate('gemini', 'YOUR_KEY_HERE')

local prompt = [[List a few popular cookie recipes using this JSON schema:
Recipe = {'recipeName': string}
Return: Array]]

local test = AI.generateText({
    prompt = prompt,
    model = "gemini-1.5-flash"
})

print(test[1].recipeName)
```

### Generate text based on a prompt with Mistral AI
```lua
local AI = exports['bcc-ai'].Initiate('mistral', 'YOUR_KEY_HERE')
local test = AI.generateText({
    model = "mistral-large-latest",
    messages = {
        { role = "user", content = "Who is the most renowned French painter?" }
    }
})

-- decoded JSON arrays are 1-indexed in Lua
print(test.choices[1].message.content)
```

### Generate text based on a prompt with Horde
```lua
local AI = exports['bcc-ai'].Initiate()
local test = AI.generateText('Hello what is your name?')
print(test[1].text)
-- Example response: Why so nervous? I'm not here to harm you." Another step closer. "My name is Sma, by the way. And you are?"
```

## Credits
### [AI Horde](https://aihorde.net)
AI Horde is free to use, but the more you use it without contributing by [joining the horde](https://github.com/Haidra-Org/AI-Horde/blob/main/README_StableHorde.md#joining-the-horde), the lower your requests sit in the priority processing queue. This trade-off is what keeps the service completely free. If you wish to support this free-to-use system, join the horde and host an LLM, which will raise your queue priority.