Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jose-donato/ollama-reply
An open-source browser extension that leverages the power of AI to generate engaging replies for social media growth.
- Host: GitHub
- URL: https://github.com/jose-donato/ollama-reply
- Owner: jose-donato
- License: mit
- Created: 2024-05-02T22:31:18.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-05-03T15:15:17.000Z (6 months ago)
- Last Synced: 2024-10-10T06:04:32.893Z (about 1 month ago)
- Topics: browser-extension, llama3, ollama, tailwindcss
- Language: TypeScript
- Homepage:
- Size: 347 KB
- Stars: 218
- Watchers: 3
- Forks: 31
- Open Issues: 5
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- jimsghstars - jose-donato/ollama-reply - An open-source browser extension that leverages the power of AI to generate engaging replies for social media growth. (TypeScript)
README
# ollama-reply
> **Disclaimer**: While I generally do not advocate for the use of automated tools for social media interactions, as they can reduce genuine human engagement, I wanted to build ollama-reply as an experiment. It was an opportunity to explore the capabilities of Ollama and dive into browser extensions.
ollama-reply is an open-source browser extension that leverages the power of the Ollama Llama3 model to generate engaging replies for social media growth. This tool is designed as a free and open alternative to [MagicReply](https://magicreply.io/).
## Features
- **Open Source** 📖: Freely modify and distribute the code.
- **Powerful AI** 🧠: Uses the Ollama Llama3 model for generating replies.
- **Browser Extension** 🌐: Easy to use directly in your Chromium-based browser.
- **Customizable** ⚙️: Configurable to use any Ollama model and adapt the answers to your needs.
- **Free** 💸: No cost to download and use the extension.

## Technologies
ollama-reply is a React-based browser extension built with the following technologies:
- [React](https://reactjs.org/)
- [Vite](https://vitejs.dev/)
- [TypeScript](https://www.typescriptlang.org/)
- [Tailwind CSS](https://tailwindcss.com/)
- [shadcn-ui](https://shadcn-ui.com/)
- [Ollama](https://ollama.com/)
- [@samrum/vite-plugin-web-extension](https://github.com/samrum/vite-plugin-web-extension)

## Getting Started
### Prerequisites
Before you begin, ensure you have:
- A Chromium-based browser
- Ollama installed - download [here](https://ollama.com/)

### Installation Steps
1. **Pull the AI Model**:
- Use the command `ollama pull llama3:8b` to download the Llama3 model. You can also use other models if you prefer; see [Configuration](#configuration) for more information.
2. **Start the Ollama Server**:
- ⚠️ Start the server with the `OLLAMA_ORIGINS` environment variable set, e.g. `OLLAMA_ORIGINS=* ollama serve`, so the browser extension is allowed to call it.
3. **Download the Repository**:
- Download the ollama-reply repository from GitHub.
4. **Unzip the Repository**:
- Unzip the downloaded repository to a desired location on your computer.
5. **Load the Extension**:
- Open Google Chrome (or another Chromium-based browser) and navigate to `chrome://extensions/`.
- Enable Developer Mode by toggling the switch at the top-right.
- Click on "Load unpacked" and select the `dist` folder inside the unzipped folder of this repository.

## Usage
Once installed, navigate to any post on Twitter or LinkedIn, and you will see an additional button labeled "Generate Reply". Clicking this button will use the Ollama Llama3 model to generate a contextually relevant reply.
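Under the hood, the button hands the post text to the extension's background script, which queries the local Ollama server. The sketch below shows roughly what that request looks like; it is a minimal illustration assuming Ollama's standard REST endpoint at `http://localhost:11434/api/generate`, and the function name is hypothetical rather than taken from the source.

```ts
// Minimal sketch, not the project's actual code: assumes Ollama's standard
// /api/generate endpoint; generateReply is an illustrative name.
const MODEL = "llama3:8b";
const SYSTEM_PROMPT = "You write short, engaging replies to social media posts.";

async function generateReply(postText: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: MODEL,
      system: SYSTEM_PROMPT,
      prompt: postText,
      stream: false, // ask for a single JSON object instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // non-streaming responses carry the text in `response`
}
```

Note that this call only succeeds if the server was started with `OLLAMA_ORIGINS=*`, as described in the installation steps.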
## Configuration
### Model Selection
By default, the extension uses the `llama3:8b` model. You can change this by pulling the model you want from Ollama and updating the `MODEL` variable in the `src/entries/background/main.ts` file. You will need to rebuild the extension after changing the model.
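For example, to try a different model (Mistral here is purely illustrative), pull it with `ollama pull mistral` and point the constant at it:

```ts
// src/entries/background/main.ts - illustrative change; any model pulled
// with `ollama pull <name>` works. Rebuild the extension afterwards.
const MODEL = "mistral";
```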
### Prompt Configuration
You can configure the answers generated by the extension by updating the `SYSTEM_PROMPT` variable in the `src/entries/background/main.ts` file. This variable is used as the prompt for the Ollama model. You will need to rebuild the extension after changing the prompt.
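A hypothetical example of such an edit (the prompt text is illustrative, not the project's default):

```ts
// src/entries/background/main.ts - illustrative prompt; rebuild afterwards.
const SYSTEM_PROMPT =
  "You are a friendly commenter. Reply in one or two sentences, " +
  "match the tone of the post, and never use hashtags.";
```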
## Development
This project uses [@samrum/vite-plugin-web-extension](https://github.com/samrum/vite-plugin-web-extension). Refer to the plugin documentation for more information.
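For orientation, a Vite config using this plugin typically looks like the sketch below. This is assembled from the plugin's documentation, not copied from this repository; the project's actual `vite.config.ts` and manifest contain more entries.

```ts
// vite.config.ts - sketch based on @samrum/vite-plugin-web-extension docs;
// the real project manifest declares more (content scripts, permissions, ...).
import { defineConfig } from "vite";
import webExtension from "@samrum/vite-plugin-web-extension";

export default defineConfig({
  plugins: [
    webExtension({
      manifest: {
        manifest_version: 3,
        name: "ollama-reply",
        version: "1.0.0",
        background: { service_worker: "src/entries/background/main.ts" },
      },
    }),
  ],
});
```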
### Project Setup
```sh
npm install
```

### Commands
#### Development
Hot Module Reloading is used to load changes inline without requiring extension rebuilds and extension/page reloads. Currently this only works in Chromium-based browsers.
```sh
npm run dev
```

#### Development, Watch
Rebuilds the extension on file changes. Requires a reload of the extension (and a page reload if using content scripts).
```sh
npm run watch
```

#### Production
Minifies and optimizes the extension build.
```sh
npm run build
```

### Load extension in browser
Loads the contents of the `dist` directory into the specified browser:
```sh
npm run serve:chrome
```

```sh
npm run serve:firefox
```

## Contributing
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are **greatly appreciated**.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request

## License
Distributed under the MIT License. See `LICENSE` for more information.
## Acknowledgments
- Inspired by [MagicReply](https://magicreply.io/)
- [@samrum/vite-plugin-web-extension](https://github.com/samrum/vite-plugin-web-extension)
- [Ollama](https://ollama.com/)