https://github.com/answerdotai/fastprogress
Simple and flexible progress bar for Jupyter Notebook and console
- Host: GitHub
- URL: https://github.com/answerdotai/fastprogress
- Owner: AnswerDotAI
- License: apache-2.0
- Created: 2018-08-17T04:01:23.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2024-08-20T15:56:03.000Z (over 1 year ago)
- Last Synced: 2025-11-12T01:39:02.340Z (2 months ago)
- Topics: developer-tools, jupyter-notebook, plots, python
- Language: Jupyter Notebook
- Homepage:
- Size: 12.5 MB
- Stars: 1,100
- Watchers: 21
- Forks: 104
- Open Issues: 22
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE-OF-CONDUCT.md
# fastprogress
A fast and simple progress bar for Jupyter Notebook and console.

## Install
To install, simply run:
```
pip install fastprogress
```
## Usage
### Example 1
Here is a simple example. Each bar takes an iterator as its main argument, and we can specify that the second bar is nested inside the first by adding the argument `parent=mb`. We can then:
- add a comment to the first bar by changing the value of `mb.main_bar.comment`
- add a comment to the second bar by changing the value of `mb.child.comment`
- write a line between the two bars with `mb.write('message')`
``` python
from fastprogress.fastprogress import master_bar, progress_bar
from time import sleep

for i in (mb := master_bar(range(10))):
    for j in progress_bar(range(100), parent=mb):
        sleep(0.01)
        mb.child.comment = 'second bar stat'
    mb.main_bar.comment = 'first bar stat'
    mb.write(f'Finished loop {i}.')
```
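A `progress_bar` can also be used on its own, without a `master_bar`, since each bar simply wraps an iterator (a minimal sketch of standalone use):

``` python
from fastprogress.fastprogress import progress_bar
from time import sleep

# progress_bar wraps any iterable and yields its items unchanged,
# so the loop body is exactly what it would be without the bar
squares = []
for i in progress_bar(range(5)):
    sleep(0.01)
    squares.append(i * i)
```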

### Example 2
To add a graph that gets updated as the training progresses, just use the command `mb.update_graph`. It will create the figure on its first use. Arguments are:
- `graphs`: a list of graphs to be plotted (each of the form `[x,y]`)
- `x_bounds`: the min and max values of the x axis (if `None`, it will default to the values given by the graphs)
- `y_bounds`: the min and max values of the y axis (if `None`, it will default to the values given by the graphs)

Note that it's best to specify `x_bounds` and `y_bounds`; otherwise the box will change as the loop progresses.
Additionally, we can give the label of each graph via the attribute `mb.names` (it should have as many elements as the `graphs` argument).
``` python
import numpy as np

mb = master_bar(range(10))
mb.names = ['cos', 'sin']
for i in mb:
    for j in progress_bar(range(100), parent=mb):
        if j % 10 == 0:
            k = 100 * i + j
            x = np.arange(0, 2*k*np.pi/1000, 0.01)
            y1, y2 = np.cos(x), np.sin(x)
            graphs = [[x, y1], [x, y2]]
            x_bounds = [0, 2*np.pi]
            y_bounds = [-1, 1]
            mb.update_graph(graphs, x_bounds, y_bounds)
        mb.child.comment = 'second bar stat'
    mb.main_bar.comment = 'first bar stat'
    mb.write(f'Finished loop {i}.')
```

Here is the rendering in console:

If a script using fastprogress is executed with its output redirected to a file, only the results of the `.write` method will be printed to that file.
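For example (a sketch, assuming fastprogress is installed; the inline script is a stand-in for any loop built like Example 1):

``` shell
# Write a minimal script whose loop emits one .write line per iteration
cat > train.py <<'EOF'
from fastprogress.fastprogress import master_bar, progress_bar
for i in (mb := master_bar(range(3))):
    for j in progress_bar(range(10), parent=mb):
        pass
    mb.write(f'Finished loop {i}.')
EOF

# With stdout redirected, the file receives the full lines emitted by
# .write rather than the in-place bar updates
python train.py > training.log
grep 'Finished loop' training.log
```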
### Example 3
Here is an example of the kind of plotting function a typical machine learning training loop can use. It also demonstrates how to set `y_bounds` dynamically.
``` python
def plot_loss_update(epoch, epochs, mb, train_loss, valid_loss):
    """Dynamically plot the losses during the training/validation loop.
    Expects `epoch` to start from 1."""
    x = range(1, epoch + 1)
    y = np.concatenate((train_loss, valid_loss))
    graphs = [[x, train_loss], [x, valid_loss]]
    x_margin = 0.2
    y_margin = 0.05
    x_bounds = [1 - x_margin, epochs + x_margin]
    y_bounds = [np.min(y) - y_margin, np.max(y) + y_margin]
    mb.update_graph(graphs, x_bounds, y_bounds)
```
And here is an emulation of a training loop that uses this function:
``` python
import random

epochs = 5
train_loss, valid_loss = [], []
for epoch in (mb := master_bar(range(1, epochs + 1))):
    # emulate train sub-loop
    for batch in progress_bar(range(2), parent=mb): sleep(0.2)
    train_loss.append(0.5 - 0.06 * epoch + random.uniform(0, 0.04))
    # emulate validation sub-loop
    for batch in progress_bar(range(2), parent=mb): sleep(0.2)
    valid_loss.append(0.5 - 0.03 * epoch + random.uniform(0, 0.04))
    plot_loss_update(epoch, epochs, mb, train_loss, valid_loss)
```
And the output:
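fastprogress picks notebook or console rendering automatically; when that detection guesses wrong (for instance in some terminal-based IPython setups), the console classes can be requested explicitly. A minimal sketch, assuming the `force_console_behavior` helper exported by `fastprogress.fastprogress`:

``` python
from fastprogress.fastprogress import force_console_behavior

# Rebind master_bar/progress_bar to their console implementations,
# bypassing the automatic notebook detection
master_bar, progress_bar = force_console_behavior()

# Used exactly as in the examples above, but always renders as text
for i in (mb := master_bar(range(2))):
    for j in progress_bar(range(5), parent=mb):
        pass
    mb.write(f'Finished loop {i}.')
```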

----
Copyright 2017 onwards, fast.ai.