https://github.com/poloclub/recast
- Host: GitHub
- URL: https://github.com/poloclub/recast
- Owner: poloclub
- License: mit
- Created: 2021-05-14T17:04:35.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2022-12-07T14:55:16.000Z (almost 3 years ago)
- Last Synced: 2024-05-12T00:47:37.703Z (over 1 year ago)
- Language: Svelte
- Size: 174 KB
- Stars: 4
- Watchers: 6
- Forks: 1
- Open Issues: 7
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
README
# RECAST
An Interactive System to Understand End-User Recourse and Interpretability of Toxicity Detection Models
For more information, check out our manuscript:
[**Enabling User Recourse and Interpretability of Toxicity Detection Models with Interactive Visualization**](https://arxiv.org/abs/2102.04427).
Austin P Wright, Omar Shaikh, Haekyu Park, Will Epperson, Muhammed Ahmed, Stephane Pinel, Duen Horng (Polo) Chau, and Diyi Yang
*Proc. ACM Hum.-Comput. Interact.*

This project consists of two components: the frontend and the backend.
Both have different setup procedures. Click on either folder to view the README for setting up each component.
To begin, clone or download this repository:
```
git clone https://github.com/poloclub/recast.git

# use degit if you don't want to download commit histories
degit poloclub/recast
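
# The lines below are only a hypothetical sketch of what comes next; the folder
# names and commands are assumptions, and the authoritative setup steps live in
# each component's own README.
cd recast/frontend    # assumed folder name for the Svelte frontend
npm install           # install frontend dependencies
npm run dev           # start the Svelte dev server
# For the backend, follow its README to set up the environment and start the server.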
```

## Consent Form
RECAST's "alternative suggestions" feature generates non-toxic alternatives to toxic text. In some instances, however, Recast recommends human-labeled toxic alternatives that are not detected as toxic by the backend model. Completing [this form](https://forms.gle/aNdrZ5TSNnG9WLSb9) indicates that a user/researcher agrees to not use Recast for the purpose of *maliciously* circumventing deployed toxicity classifiers.
As a precaution, we've withheld the trained Jigsaw model that Recast uses to generate these alternatives. Though it is possible to train a model on your own, completing this form will notify an author to send over a trained model file.
## Credits
RECAST was created by
Austin P Wright,
Omar Shaikh,
Haekyu Park,
Will Epperson,
Muhammed Ahmed,
Stephane Pinel,
Polo Chau, and
Diyi Yang

## Citation
```bibTeX
@article{10.1145/3449280,
author = {Wright, Austin P. and Shaikh, Omar and Park, Haekyu and Epperson, Will and Ahmed, Muhammed and Pinel, Stephane and Chau, Duen Horng (Polo) and Yang, Diyi},
title = {RECAST: Enabling User Recourse and Interpretability of Toxicity Detection Models with Interactive Visualization},
year = {2021},
issue_date = {April 2021},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {5},
number = {CSCW1},
url = {https://doi.org/10.1145/3449280},
doi = {10.1145/3449280},
journal = {Proc. ACM Hum.-Comput. Interact.},
month = apr,
articleno = {181},
numpages = {26},
keywords = {natural language processing, content moderation, interactive visualization, toxicity detection, intervention}
}
```

## License
The software is available under the [MIT License](https://github.com/poloclub/RECAST/blob/master/LICENSE).
## Contact
If you have any questions, feel free to [open an issue](https://github.com/poloclub/RECAST/issues/new/choose) or contact [Omar Shaikh](https://oshaikh.com).