
## Ollama LLM Graphical User Interface

> 👨‍đŸ’ģ Developed by Matheus Ramalho de Oliveira
> 🏗ī¸ Brazilian Software Engineer
> ✉ī¸ [email protected]
> đŸĻĢ [LinkedIn](https://br.linkedin.com/in/kastorcode) â€ĸ [Instagram](https://instagram.com/kastorcode)

---




This application is a frontend for the large language model (LLM) server Ollama. Ollama is an open source tool that makes it easy to run LLMs, such as Meta's Llama models, on your own machine.
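A frontend like this talks to the local Ollama server over its HTTP API. The sketch below, in TypeScript, shows one way to build and send a completion request; the function names are illustrative and not taken from this project's source, but the `/api/generate` endpoint, the `llama3` model name, and the default port `11434` match Ollama's documented defaults.

```typescript
// Illustrative sketch of calling the Ollama HTTP API from a frontend.
// Assumes a local Ollama server on its default port (11434).
type GenerateRequest = {
  model: string;
  prompt: string;
  stream: boolean; // false = return the full completion in one response
};

function buildGenerateRequest(prompt: string, model = "llama3"): GenerateRequest {
  return { model, prompt, stream: false };
}

async function generate(baseUrl: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(prompt)),
  });
  const data = await res.json();
  return data.response; // Ollama puts the completion text in `response`
}
```

With `stream: true` instead, Ollama returns the completion incrementally as newline-delimited JSON chunks, which is what chat UIs typically use to render tokens as they arrive.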

---

### Screenshots

(Screenshot images are not reproduced in this text version.)
---

### Technologies
- [Craco](https://craco.js.org)
- [Flux Architecture](https://facebookarchive.github.io/flux)
- [React.js](https://react.dev)
- [React Hooks Global State](https://npmjs.com/package/react-hooks-global-state)
- [React Router](https://reactrouter.com)
- [React Transition Group](https://reactcommunity.org/react-transition-group)
- [Styled Components](https://styled-components.com)
- [TypeScript](https://typescriptlang.org)
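
The Flux architecture listed above means data flows one way: views dispatch actions, a store reduces them into new state, and subscribed views re-render. The TypeScript sketch below shows that pattern in miniature; all names (`Store`, `SET_PROMPT`, etc.) are illustrative, not taken from this project's source.

```typescript
// Minimal Flux-style unidirectional data flow: action -> store -> listeners.
type Action =
  | { type: "SET_PROMPT"; payload: string }
  | { type: "CLEAR" };

type State = { prompt: string };

function reduce(state: State, action: Action): State {
  switch (action.type) {
    case "SET_PROMPT":
      return { ...state, prompt: action.payload };
    case "CLEAR":
      return { ...state, prompt: "" };
  }
}

class Store {
  private state: State = { prompt: "" };
  private listeners: Array<(s: State) => void> = [];

  getState(): State {
    return this.state;
  }

  subscribe(fn: (s: State) => void): void {
    this.listeners.push(fn);
  }

  // Every state change goes through dispatch, keeping updates predictable.
  dispatch(action: Action): void {
    this.state = reduce(this.state, action);
    this.listeners.forEach((fn) => fn(this.state));
  }
}
```

In the real app, a library such as React Hooks Global State (listed above) plays the store role, exposing the state to components through hooks instead of manual subscriptions.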

---

### Installation and execution

1. You need to have the [Ollama server](https://ollama.com/download) installed on your machine, or configure the app to use an external URL;
2. Make a clone of this repository;
3. Open the project folder in a terminal;
4. Run `yarn` to install dependencies;
5. Run `yarn start` to launch at `http://localhost:3000`.

---

### Running from GitHub Pages

1. You need to have the [Ollama server](https://ollama.com/download) installed on your machine, or configure the app to use an external URL;
2. By default, the app uses the llama3 model; you can install it with the command: `ollama run llama3`;
3. If you are running the server locally, start it with CORS allowed for GitHub Pages: `export OLLAMA_ORIGINS=https://*.github.io && ollama serve`;
4. Access the app at: [kastorcode.github.io/ollama-gui-reactjs](https://kastorcode.github.io/ollama-gui-reactjs).

---


<kastor.code/>