LLM Function Calling Demo for YouTube video 🔥
https://github.com/yankeexe/llm-function-calling-demo
- Host: GitHub
- URL: https://github.com/yankeexe/llm-function-calling-demo
- Owner: yankeexe
- Created: 2024-11-07T14:10:47.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2024-11-07T20:56:23.000Z (3 months ago)
- Last Synced: 2024-12-02T18:01:25.234Z (2 months ago)
- Topics: ai, langchain, langchain-app, langchain-python, large-language-models, llm, ollama, ollama-app, streamlit, streamlit-webapp
- Language: Python
- Homepage: https://youtu.be/1Wen70lzX-8
- Size: 9.77 KB
- Stars: 2
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Function Calling Demo Application
Demo function-calling app for the YouTube video.
Watch the [video](https://youtu.be/1Wen70lzX-8).
## 🔨 Setting up locally
Create a virtualenv and install the dependencies.
This step is not required if you are running in Docker.
```sh
make setup
```

## ⚡️ Running the application
Make sure you have [Ollama](https://ollama.com/download) installed and running on your machine.
By default, the app uses the [mistral-nemo](https://ollama.com/library/mistral-nemo) model, but you can also use [Llama3.1](https://ollama.com/library/llama3.1) or [Llama3.2](https://ollama.com/library/llama3.2).
Download these models before running the application. Update [app.py](https://github.com/yankeexe/llm-function-calling-demo/blob/55b73c6947f05d460f284d92136285b4e1d233bd/app.py#L66) to change the model if necessary.
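These models support tool calling: the app describes its functions to the model, and the model replies with the name and arguments of the function to invoke. A minimal sketch of that dispatch step, assuming a hypothetical `get_weather` tool (the real tools live in app.py):

```python
import json

# Illustrative tool -- the actual app defines its own functions in app.py.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch_tool_call(tool_call: dict) -> str:
    """Map a model-emitted tool call onto a local Python function."""
    func = TOOLS[tool_call["name"]]
    args = tool_call["arguments"]
    if isinstance(args, str):  # some backends return arguments as JSON text
        args = json.loads(args)
    return func(**args)

# A response shaped like what a tool-calling chat model emits:
call = {"name": "get_weather", "arguments": {"city": "Kathmandu"}}
print(dispatch_tool_call(call))  # Sunny in Kathmandu
```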
### Running locally
```sh
make run
```

### Running in a container
```sh
make run-docker
```

⚠️ Does not work on Linux 🐧

The application running inside the container uses the special DNS name `host.docker.internal` to reach Ollama on the host machine.
However, this DNS name is not resolvable on Linux.
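On Linux, a common workaround (not wired into this repo's Makefile; the image name below is illustrative) is Docker's `host-gateway` alias, which maps `host.docker.internal` to the host:

```sh
# Map host.docker.internal to the host's gateway IP (Docker 20.10+).
# "llm-function-calling-demo" is an illustrative image name; 8501 is
# Streamlit's default port.
docker run --rm -p 8501:8501 \
  --add-host=host.docker.internal:host-gateway \
  llm-function-calling-demo
```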
## ✨ Linters and Formatters
Check for linting rule violations:
```sh
make check
```

Auto-fix linting violations:
```sh
make fix
```

## 🤸‍♀️ Getting Help
```sh
make
# OR
make help
```