Simple wrapper for OpenAI chat to nicely parse and display the results returned from the API client
- Host: GitHub
- URL: https://github.com/kmkolasinski/ipython-llm-magics
- Owner: kmkolasinski
- License: MIT
- Created: 2024-10-20T08:45:23.000Z (17 days ago)
- Default Branch: main
- Last Pushed: 2024-10-22T05:10:53.000Z (15 days ago)
- Last Synced: 2024-10-23T07:29:18.463Z (14 days ago)
- Language: Python
- Size: 5.06 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - Changelog: HISTORY.md
  - License: LICENSE
# LLM Magics for Jupyter Notebooks
![CI](https://github.com/kmkolasinski/ipython-llm-magics/actions/workflows/main.yml/badge.svg)
Interact seamlessly with Large Language Models (LLMs) like OpenAI's GPT models directly from your Jupyter Notebook using IPython magics.
Install from PyPI (https://pypi.org/project/llm-magics/):
```bash
pip install llm-magics
```

![Description of GIF](resources/demo.gif)
## Features
- **Interactive Chat Interface**: Engage in a conversational exchange with LLMs within your notebook cells.
- **Customizable Models**: Switch between different OpenAI chat models (e.g., `gpt-3.5-turbo`, `gpt-4`).
- **Set System Messages**: Define system prompts to guide the behavior of the LLM.
- **Persistent Chat History**: Maintain context across multiple interactions.
- **Rich Rendering**: Receive responses with proper formatting, including syntax-highlighted code blocks.
- **Easy Clearing**: Reset the conversation when needed.

## Installation
### Prerequisites
- Python 3.10 or higher
- Jupyter Notebook or JupyterLab
- An [OpenAI API key](https://platform.openai.com/account/api-keys)

### Steps
1. **Clone the Repository**
```bash
git clone https://github.com/kmkolasinski/ipython-llm-magics
cd ipython-llm-magics
```

2. **Install Dependencies**
Install the required Python packages:
```bash
pip install -r requirements.txt
```

3. **Install the Package**
```bash
pip install .
```

## Configuration
### Setting the OpenAI API Key
Before using the magics, ensure your OpenAI API key is accessible to the application.
#### Option 1: Environment Variable
Set the `OPENAI_API_KEY` environment variable in your shell:
```bash
export OPENAI_API_KEY='your-api-key-here'
```

#### Option 2: Within the Notebook
Alternatively, set the API key within your notebook:
```python
import os
os.environ['OPENAI_API_KEY'] = 'your-api-key-here'
```

## Usage
### Loading the Extension
In your Jupyter Notebook, load the `llm_magics` extension:
```python
%load_ext llm_magics
```

### Setting the Model
Specify the OpenAI model you want to use:
```python
%llm_set_model gpt-4o
```

*Available models include `gpt-3.5-turbo`, `gpt-4o`, etc.*
### Setting a System Message (Optional)
Define a system message to guide the assistant's behavior:
```python
%llm_set_system_message "You are a helpful assistant."
```

### Starting a Conversation
Use the `%%llm_chat` cell magic to send a message to the LLM:
```python
%%llm_chat
Write a Python function that generates a random integer between 1 and 100.
```

**Response:**
*The assistant will provide a Python function as per your request.*
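For instance, the assistant might return a function along these lines (an illustrative sketch, not the model's literal output):

```python
import random


def random_between_1_and_100() -> int:
    """Return a random integer between 1 and 100, inclusive."""
    return random.randint(1, 100)
```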
### Continuing the Conversation
Maintain context across multiple `%%llm_chat` cells:
```python
%%llm_chat
Now modify the function to generate a random floating-point number between 0 and 1.
```

**Response:**
*The assistant will update the function accordingly.*
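Because the chat history is preserved, the assistant can revise its earlier answer; a plausible follow-up response might look like (again illustrative only):

```python
import random


def random_between_0_and_1() -> float:
    """Return a random floating-point number in the half-open range [0, 1)."""
    return random.random()
```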
### Inserting local variables into the chat
```python
not_sorted_list = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
expected_output = sorted(not_sorted_list)
```
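The `$variable` placeholders in a chat cell are presumably expanded from the notebook namespace before the prompt is sent, in the spirit of Python's `string.Template` substitution. A minimal sketch of that idea (the `expand_prompt` helper is hypothetical, not part of the library's API):

```python
from string import Template


def expand_prompt(prompt: str, namespace: dict) -> str:
    # Hypothetical helper: replace each $name placeholder with the repr()
    # of the matching notebook variable, leaving unknown names untouched.
    return Template(prompt).safe_substitute(
        {key: repr(value) for key, value in namespace.items()}
    )


namespace = {"not_sorted_list": [3, 1, 4, 1, 5]}
print(expand_prompt("input = $not_sorted_list", namespace))
# → input = [3, 1, 4, 1, 5]
```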
```python
%%llm_chat
Write Python code to sort the list in descending order using the merge sort algorithm.
So that I can write:
input = $not_sorted_list
assert $expected_output == my_merge_sort(input)
```

### Clearing the Conversation History
Reset the chat history when needed:
```python
%llm_clear
```

## Rendering and Syntax Highlighting
The extension includes rich rendering of the LLM's responses:
- Code blocks are syntax-highlighted using [Prism.js](https://prismjs.com/) and [Highlight.js](https://highlightjs.org/).
- Copy buttons are added to code blocks for convenience.
- Markdown formatting is preserved in the responses.

## History
- **v1.1.0**: Initial release with basic chat functionality and rendering.