https://github.com/soodaayush/llamatalk
A chatbot powered by Meta's Codellama.
- Host: GitHub
- URL: https://github.com/soodaayush/llamatalk
- Owner: soodaayush
- Created: 2024-02-01T21:39:50.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2025-03-08T23:57:09.000Z (about 1 year ago)
- Last Synced: 2025-03-09T00:24:28.674Z (about 1 year ago)
- Topics: codellama, css, html, javscript, ollama, scss
- Language: SCSS
- Homepage:
- Size: 66.4 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# LlamaTalk

## Inspiration
I wanted to build my own local chatbot with a web interface. Once I discovered Ollama, a service that makes running LLMs locally straightforward, I decided to experiment with building a local chatbot website.
## Challenges
Connecting the Large Language Model (LLM) to the website itself, and making sure the right fields were read out of the response object.
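For context, a non-streaming reply from Ollama's generate endpoint is a JSON object, and the generated text lives in its `response` field (the values below are illustrative, not actual output):

```javascript
// Shape of a non-streaming Ollama /api/generate reply (values illustrative).
const reply = {
  model: "codellama",
  response: "A closure is a function that captures its surrounding scope...",
  done: true,
};

// The chatbot only needs the generated text, not the metadata fields:
console.log(reply.response);
```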
## Lessons Learned
I learned about using the Ollama service and how to send and fetch data from the LLM.
## The Website
A chatbot powered by Meta's Codellama. You can ask it a question and receive a response. For now, you must run the LLM locally on your machine through a service called Ollama, which serves the model on a localhost server whose address you can set in the JavaScript file. LlamaTalk sends prompts to that localhost endpoint and receives the responses. You can customize the code to use any model, with any parameter count, that you please.
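The prompt/response flow described above can be sketched with a small `fetch` call against Ollama's local HTTP API. This is a minimal sketch, not the project's actual code: the host and port are Ollama's defaults, and the `askLlama`/`buildGeneratePayload` names and the model choice are illustrative.

```javascript
// Build the request body for Ollama's /api/generate endpoint.
// "codellama" matches this project's model; stream: false requests a
// single JSON reply rather than a stream of chunks.
function buildGeneratePayload(prompt, model = "codellama") {
  return { model, prompt, stream: false };
}

// Send a prompt to the local Ollama server and return the reply text.
// 11434 is Ollama's default port; change the URL if your setup differs.
async function askLlama(prompt) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGeneratePayload(prompt)),
  });
  const data = await res.json();
  // The generated text is in the "response" field of the JSON body.
  return data.response;
}
```

Keeping the payload builder separate from the network call makes it easy to swap in a different model or parameter count without touching the request logic.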
## Links
Ollama: https://ollama.com/