Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/pycui/universal-api-cohere
- Host: GitHub
- URL: https://github.com/pycui/universal-api-cohere
- Owner: pycui
- License: MIT
- Created: 2024-05-30T22:31:31.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-05-30T22:41:32.000Z (6 months ago)
- Last Synced: 2024-05-31T00:42:35.106Z (6 months ago)
- Language: Python
- Size: 8.79 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Universal API
*Don't know how to implement a function? Let Universal API help you on the fly!*
This is a non-serious library that can implement any function on the fly using LLMs. To use it, simply call a function with a descriptive name and parameters, and it will be defined, implemented, and invoked at runtime. Generated functions are cached, so you don't pay again for functions that have already been implemented.
(Note: this repo is based on my repo at https://github.com/pycui/universal-api, adapted to the Cohere API.)
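To make the mechanism concrete, here is a minimal sketch of how such dynamic dispatch and caching can work in Python. This is hypothetical illustration code, not this repo's implementation; `UniversalAPISketch` and `fake_llm` are made-up names, and the LLM call is replaced by a canned stand-in so the sketch runs offline.

```python
# Minimal sketch of the dynamic-dispatch idea (hypothetical, not this repo's code).

class UniversalAPISketch:
    def __init__(self, generate_code):
        self._generate_code = generate_code  # callable: function name -> Python source
        self._cache = {}                     # function name -> compiled function

    def __getattr__(self, name):
        # Any unknown attribute access becomes an on-the-fly function.
        if name.startswith("_"):
            raise AttributeError(name)
        if name not in self._cache:
            source = self._generate_code(name)  # the real library would ask an LLM here
            namespace = {}
            exec(source, namespace)             # UNSAFE: executes unverified code
            self._cache[name] = namespace[name]
        return self._cache[name]

# Stand-in for the LLM so the sketch is self-contained.
def fake_llm(name):
    return f"def {name}(xs): return sorted(xs)"

api = UniversalAPISketch(fake_llm)
print(api.sort([3, 2, 1]))     # [1, 2, 3]
print(api.sort([4, 3, 2, 1]))  # [1, 2, 3, 4] -- served from the cache
```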
## Usage
```python
# Assuming `UniversalAPI` is exported from the package root (hypothetical import path):
from universal_api import UniversalAPI

api = UniversalAPI()
# Start to call arbitrary functions
print(api.sort([3, 2, 1])) # returns [1, 2, 3]
print(api.sort([4, 3, 2, 1])) # returns [1, 2, 3, 4] using cached implementation
print(api.sort([1, 2, 3], reverse=True)) # returns [3, 2, 1]
print(api.add(1, 2)) # returns 3
print(api.reverse('hello')) # returns 'olleh'
api.fizzbuzz(15) # prints the fizzbuzz sequence up to 15
api.print_picachu() # prints an ASCII art of Pikachu
```

## Warning
This library will execute unverified, LLM-generated code on your local machine. It is **NOT** safe to run in production (or really, any serious environment).

## Notes
By default, this library uses the OpenAI GPT API. You can set environment variables such as `OPENAI_BASE_URL` to point at other OpenAI-compatible LLM endpoints (e.g. Anyscale Endpoints), which lets you run other models such as the LLaMA series. It's also possible to use local LLMs.
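For example, assuming the library goes through the standard OpenAI Python client (which honors `OPENAI_BASE_URL` and `OPENAI_API_KEY`), redirecting it to another endpoint could look like the following; the endpoint URL and import path are illustrative, not taken from this repo:

```python
import os

# Redirect the OpenAI-compatible client before the library is imported.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8000/v1"  # e.g. a local OpenAI-compatible server
os.environ["OPENAI_API_KEY"] = "sk-placeholder"             # many local servers ignore the key

from universal_api import UniversalAPI  # hypothetical import path; check the repo

api = UniversalAPI()
print(api.add(1, 2))  # now implemented by whatever model the endpoint serves
```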