https://github.com/ntt-dkiku/route-explainer
# RouteExplainer: An Explanation Framework for Vehicle Routing Problem (PAKDD 2024)
This repo is the official implementation of RouteExplainer: An Explanation Framework for Vehicle Routing Problem (PAKDD 2024).
RouteExplainer is the first explanation framework for the Vehicle Routing Problem (VRP). It generates an explanation for the influence of a specific edge in a route, using a counterfactual explanation framework.
The explanation text is generated by an LLM (GPT-4).
On Hugging Face Spaces, we publish a demo of interactive tourist route generation, which is a promising application of RouteExplainer. Please try it out for yourself.
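To give a rough feel for the counterfactual idea (all distances, node names, and helpers below are hypothetical illustrations, not the repository's API): the influence of an edge can be measured by comparing the actual best route against the best route that is forced to avoid that edge.

```python
from itertools import permutations

# Toy symmetric distance matrix over 4 nodes; node 0 is the depot.
# Purely illustrative -- not data or code from this repository.
DIST = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]

def route_length(route):
    """Total length of a closed tour 0 -> route -> 0."""
    tour = [0, *route, 0]
    return sum(DIST[a][b] for a, b in zip(tour, tour[1:]))

def best_route(nodes, forbidden_edge=None):
    """Brute-force the shortest tour, optionally avoiding one undirected edge."""
    best = None
    for perm in permutations(nodes):
        tour = [0, *perm, 0]
        edges = {frozenset(e) for e in zip(tour, tour[1:])}
        if forbidden_edge is not None and frozenset(forbidden_edge) in edges:
            continue
        if best is None or route_length(perm) < route_length(best):
            best = perm
    return best

actual = best_route([1, 2, 3])
# Counterfactual question: how much worse is the best route that avoids edge (0, 1)?
counterfactual = best_route([1, 2, 3], forbidden_edge=(0, 1))
influence = route_length(counterfactual) - route_length(actual)
```

Here the gap between the two tour lengths quantifies the influence of the edge; RouteExplainer builds its textual explanations on top of this kind of comparison.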
## 📦 Setup
We recommend using Docker to set up the development environment. Please use the [Dockerfile](./Dockerfile) in this repository.
In the following, all commands are supposed to be run inside the Docker container.
```
docker build -t route_explainer/route_explainer:1.0 .
```
You can run code interactively in the container after launching it with the following command (```<>``` indicates a placeholder, which you should replace according to your settings).
Please set ```shm_size``` as large as possible, because the continuous reuse of conventional solvers (e.g., Concorde and LKH) consumes a lot of shared memory.
Continuous reuse is required when generating datasets and evaluating edge classifiers.
```
docker run -it --rm -v <mount_path>:/workspace/app --name evrp-eps -p <host_port>:<container_port> --shm-size <shm_size> --gpus all route_explainer/route_explainer:1.0 bash
```
If you use LKH and Concorde, you need to install them with the following command. LKH and Concorde are required for reproducing the experiments, but not for the demo.
```
python install_solvers.py
```

## 🔧 Usage
Here, we describe the general usage of our code. See [Reproducibility](#rep) for the reproducibility of the experiments in our paper.

### 1. Generating synthetic data with labels
You can generate a synthetic dataset with the following command. Here, the ```solver``` generates routes for the given VRP instances, and the ```classifier``` annotates the edges in the generated routes.
The main options are as follows (check the other options with the ```-h``` option):
| parameter | options | remarks|
|-----------|---------|--------|
| problem | ```tsptw```, ```pctsp```, ```pctsptw```, ```cvrp```.| |
| solver | tsptw: ```ortools```, ```lkh```. pctsp: ```ortools```. pctsptw: ```ortools```. cvrp: ```ortools```, ```lkh```.| available solvers depend on the problem |
| classifier| tsptw, pctsp: ```ortools```, ```lkh```, ```concorde```. pctsptw: ```ortools```. cvrp: ```ortools```, ```lkh```. | available classifiers depend on the problem |

```
python generate_dataset.py --problem <problem> --num_nodes <num_nodes> --num_samples 128000 10000 10000 --solver <solver> --classifier <classifier> --output_dir data --random_seed 1234 --annotation --parallel
```
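The solver/classifier availability in the table above can be captured in a small lookup. The helper below is our own illustrative sketch, not part of ```generate_dataset.py```, which performs its own validation:

```python
# Solver/classifier availability per problem, transcribed from the table above.
# Illustrative only -- generate_dataset.py validates its own arguments.
SOLVERS = {
    "tsptw": {"ortools", "lkh"},
    "pctsp": {"ortools"},
    "pctsptw": {"ortools"},
    "cvrp": {"ortools", "lkh"},
}
CLASSIFIERS = {
    "tsptw": {"ortools", "lkh", "concorde"},
    "pctsp": {"ortools", "lkh", "concorde"},
    "pctsptw": {"ortools"},
    "cvrp": {"ortools", "lkh"},
}

def check_combo(problem, solver, classifier):
    """Raise ValueError if the solver or classifier is unavailable for the problem."""
    if problem not in SOLVERS:
        raise ValueError(f"unknown problem: {problem}")
    if solver not in SOLVERS[problem]:
        raise ValueError(f"solver {solver} is not available for {problem}")
    if classifier not in CLASSIFIERS[problem]:
        raise ValueError(f"classifier {classifier} is not available for {problem}")
    return True

check_combo("tsptw", "lkh", "concorde")  # a valid combination
```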
The datasets are saved with the name ```<output_dir>/<problem>/<data_type>_<problem>_<num_nodes>nodes_<num_samples>samples_seed<seed>.pkl```,
where ```data_type``` is ```train```, ```valid```, or ```eval```.

### 2. Training
You can train the edge classifier with the following command. If you want to use the CPU, please set the ```gpu``` option to ```-1```.
```
python train.py --problem <problem> --train_dataset_path <train_dataset_path> --valid_dataset_path <valid_dataset_path> --model_checkpoint_path checkpoints/<model_name> --gpu <gpu_id>
```

### 3. Evaluating the edge classifier
You can evaluate the edge classifier with the following command. The best checkpoint in ```model_dir``` is selected automatically.
The ```parallel``` option here enables parallel dataset loading.
Please ensure that the problem the model was trained on and the problem of the evaluation dataset are identical.
```
python eval_classifier.py --model_type nn --model_dir checkpoint/<model_dir> --dataset_path <dataset_path> --gpu <gpu_id> --parallel
```

## 💬 Explanation generation (demo)
Go to http://localhost:8888 (replace 8888 with your ```container_port```) after launching the Streamlit app with the following command.
This is a standalone demo, so you may skip the above experiments and try this first.
```
streamlit run app.py --server.port <container_port>
```
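Conceptually, the explanation step hands the LLM a description of the actual route, a counterfactual route, and their difference, and asks for a natural-language explanation. The sketch below is entirely hypothetical (the app's real prompt templates and GPT-4 calls differ); it only illustrates the kind of prompt assembly involved:

```python
# Hypothetical prompt assembly for the LLM explanation step.
# Function name, arguments, and wording are our own illustration,
# not the prompts actually used by app.py.
def build_explanation_prompt(edge, actual_route, cf_route, actual_len, cf_len):
    verdict = "beneficial" if actual_len <= cf_len else "harmful"
    return (
        f"The planned route {actual_route} (length {actual_len}) uses edge {edge}. "
        f"The best alternative avoiding that edge is {cf_route} (length {cf_len}). "
        f"Explain, for a tourist, why keeping edge {edge} is {verdict} "
        f"and what would change without it."
    )

prompt = build_explanation_prompt(
    edge=(0, 1),
    actual_route=[0, 1, 3, 2, 0],
    cf_route=[0, 2, 1, 3, 0],
    actual_len=23,
    cf_len=29,
)
print(prompt)
```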
We also publish this demo on Hugging Face Spaces, so you can easily try it there.

## 💽 Datasets and checkpoints (Work in progress)
Coming Soon!

## 🧪 Reproducibility (Work in progress)
Coming Soon!
## 🐛 Bug reports and questions
If you encounter a bug or have any questions, please post issues in this repo.

## 📃 Licence
Our code is licensed by NTT. Basically, the use of our code is limited to research purposes. See [LICENSE](./LICENSE) for more details.

## 🤝 Citation
If you find this work useful, please cite our paper as follows:
```
@article{dkiku2024routeexplainer,
  author  = {Daisuke Kikuta and Hiroki Ikeuchi and Kengo Tajiri and Yuusuke Nakano},
  title   = {RouteExplainer: An Explanation Framework for Vehicle Routing Problem},
  year    = {2024},
  journal = {arXiv preprint arXiv:2403.03585},
  url     = {https://arxiv.org/abs/2403.03585}
}
```