Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/nebeyoumusie/reflection-70b-chat
This GitHub repository integrates Reflection Llama-3.1 70B, the world's top open-source large language model (LLM), into a Streamlit-based user interface. The LLM is trained using a new technique called Reflection-Tuning, which enables it to detect and correct mistakes in its own reasoning.
langchain llama3 openrouter python streamlit streamlit-webapp
Last synced: 3 months ago
- Host: GitHub
- URL: https://github.com/nebeyoumusie/reflection-70b-chat
- Owner: NebeyouMusie
- License: mit
- Created: 2024-09-09T10:29:09.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2024-09-28T17:26:08.000Z (3 months ago)
- Last Synced: 2024-10-12T18:01:45.645Z (3 months ago)
- Topics: langchain, llama3, openrouter, python, streamlit, streamlit-webapp
- Language: Python
- Homepage: https://reflection-70b.streamlit.app/
- Size: 31.3 KB
- Stars: 2
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Reflection 70B Chat
![Reflection 70B Chat UI Image](./images/reflection-ui.png)

## Description
- This GitHub repository integrates `Reflection Llama-3.1 70B`, the world's top open-source large language model (LLM), into a Streamlit-based user interface. The LLM is trained using a new technique called Reflection-Tuning, which enables it to detect and correct mistakes in its own reasoning.

## Libraries Used
- requests
- streamlit
- langchain
- python-dotenv
- langchain_community
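Since the install step below uses `pip install -r requirements.txt`, a matching `requirements.txt` would look roughly like the following sketch; the repository does not state version pins, so none are shown (note the pip package name for `langchain_community` is `langchain-community`):

```text
# requirements.txt (sketch; no version pins are given in the repository)
requests
streamlit
langchain
langchain-community
python-dotenv
```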
## File and Folder Explanation
1. `lib`: contains configuration and utility code files.
2. `images`: contains UI image.
3. `app.py`: the main script that runs the Streamlit UI.
4. `lib/config.py`: contains functions to load our environment variables and get our API keys.
5. `lib/utils.py`: contains a function to set up our LLM, accept the user query, and return the response (a minimal sketch follows this list).
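The repository's implementation is not reproduced on this page, so the following is only a minimal sketch of how `lib/config.py` and `lib/utils.py` could be wired together with the libraries listed above; the helper names (`get_api_key`, `get_llm_response`) and the OpenRouter model slug are assumptions for illustration, not the actual code.

```python
# lib/config.py (sketch): load environment variables and expose the API key.
import os
from dotenv import load_dotenv

def get_api_key() -> str:
    """Read OPENROUTER_API_KEY from the .env file / process environment."""
    load_dotenv()  # makes the variables defined in .env visible via os.getenv
    return os.getenv("OPENROUTER_API_KEY")


# lib/utils.py (sketch): set up the LLM and answer a single user query.
from langchain_community.chat_models import ChatOpenAI
# from lib.config import get_api_key

def get_llm_response(query: str) -> str:
    """Send the user's query to Reflection 70B via OpenRouter and return the reply text."""
    llm = ChatOpenAI(
        model="mattshumer/reflection-70b",                # assumed OpenRouter model slug
        openai_api_key=get_api_key(),
        openai_api_base="https://openrouter.ai/api/v1",   # OpenRouter's OpenAI-compatible endpoint
    )
    return llm.invoke(query).content
```

Pointing `ChatOpenAI` at OpenRouter's OpenAI-compatible endpoint is what lets a LangChain app reach Reflection 70B with nothing more than an `OPENROUTER_API_KEY`.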
## Installation
1. Prerequisites
- Git
- Command line familiarity
2. Clone the Repository: `git clone https://github.com/NebeyouMusie/Reflection-70B-Chat.git`
3. Create and Activate Virtual Environment (Recommended)
- `python -m venv venv`
   - `source venv/bin/activate` on macOS/Linux, or `venv\Scripts\activate` on Windows
4. Navigate to the project's directory with `cd ./Reflection-70B-Chat` in your terminal
5. Install Libraries: `pip install -r requirements.txt`
6. Enter your `OPENROUTER_API_KEY` in the `example.env` file, then rename the file to `.env` (see the sketch after these steps). You can get your `OPENROUTER_API_KEY` from [here](https://openrouter.ai/settings/keys).
7. Run `streamlit run app.py`
8. Open the link that appears in your terminal in your preferred browser.
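For step 6, the renamed `.env` file only needs the single OpenRouter variable read by `lib/config.py`; the value below is a placeholder, not a real key:

```text
# .env (renamed from example.env)
OPENROUTER_API_KEY=your-openrouter-api-key-here
```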
## Usage
- Start by typing your question in the chat input located at the bottom of the app.
- The LLM will provide a response, which includes the action it took to arrive at the answer along with the main response (`output`).
- Note that it won't remember previous user interactions (no memory), as that functionality has not been added; a sketch of the underlying chat loop follows below.
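The page does not reproduce `app.py`, but a minimal Streamlit chat loop matching the behaviour described above might look like the sketch below; `get_llm_response` is the assumed helper from the `lib/utils.py` sketch earlier, and since only the current question is sent to the model, earlier turns are not remembered.

```python
# app.py (sketch): Streamlit chat UI without conversation memory.
import streamlit as st

from lib.utils import get_llm_response  # assumed helper name (see the sketch above)

st.title("Reflection 70B Chat")

# st.chat_input renders the chat box at the bottom of the app.
query = st.chat_input("Ask Reflection 70B anything...")

if query:
    with st.chat_message("user"):
        st.write(query)
    with st.chat_message("assistant"):
        # Only the current question is sent, so earlier turns are not remembered.
        st.write(get_llm_response(query))
```

Adding memory would mean accumulating the conversation (for example in `st.session_state`) and passing it along with each request, which the repository notes it does not do yet.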
## Collaboration
- Collaborations are welcome ❤️

## Acknowledgments
- I would like to thank [OpenRouter](https://openrouter.ai/)
## Contact
- LinkedIn: [Nebeyou Musie](https://www.linkedin.com/in/nebeyou-musie)
- Gmail: [email protected]
- Telegram: [Nebeyou Musie](https://t.me/NebeyouMusie)