Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/charliegerard/safe-space
GitHub Action that checks the toxicity level of comments and PR reviews to help make repos safe spaces.
- Host: GitHub
- URL: https://github.com/charliegerard/safe-space
- Owner: charliegerard
- License: gpl-3.0
- Created: 2020-08-16T12:56:36.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2021-06-24T16:59:05.000Z (over 3 years ago)
- Last Synced: 2024-12-08T06:42:16.712Z (14 days ago)
- Topics: github-actions, javascript, machine-learning, tensorflow, tensorflowjs, tfjs
- Language: JavaScript
- Homepage:
- Size: 18.4 MB
- Stars: 472
- Watchers: 7
- Forks: 12
- Open Issues: 4
- Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE.md
README
# Safe space - GitHub Action
GitHub Action that uses machine learning to detect potentially toxic comments added to PRs and issues, so authors have a chance to edit them and keep repos a safe space.
It uses the [TensorFlow.js toxicity classification model](https://github.com/tensorflow/tfjs-models/tree/master/toxicity).
It currently works when comments are posted on issues and PRs, as well as when pull request reviews are submitted.
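For a sense of what the classification step looks like, here is a minimal standalone sketch using the same `@tensorflow-models/toxicity` package (illustrative only; the action's internal code may differ, and the Node backend import is an assumption):

```js
// Minimal sketch: classify a comment with the TensorFlow.js toxicity model.
// Not the action's actual source.
require('@tensorflow/tfjs-node'); // assumed backend for running in Node
const toxicity = require('@tensorflow-models/toxicity');

const threshold = 0.9; // minimum prediction confidence (see toxicity_threshold below)

toxicity.load(threshold).then(async (model) => {
  const predictions = await model.classify(['You are a terrible person!']);
  // Each prediction has a label (e.g. "insult") and a boolean match flag.
  predictions.forEach(({ label, results }) => {
    console.log(label, results[0].match);
  });
});
```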
## Demo
![](demo.gif)
If you want some details about how it works, feel free to check the [blog post](https://charliegerard.dev/blog/github-action-toxic-comments).
## How to use
_If you do not have any GitHub Actions already set up in your repo, start by creating a `.github/workflows` folder._
Inside your workflows folder, create a new `.yml` file, for example `main.yml`, and copy the following lines:
```yml
on: [issue_comment, pull_request_review]

jobs:
  toxic_check:
    runs-on: ubuntu-latest
    name: Safe space
    steps:
      - uses: actions/checkout@v2
      - name: Safe space - action step
        uses: charliegerard/safe-space@master
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

`GITHUB_TOKEN` is **required** (note that GitHub [automatically creates this token](https://docs.github.com/en/free-pro-team@latest/actions/reference/authentication-in-a-workflow)), but two other parameters are optional:
- `message` - a custom message you'd like to display in the automatic comment
- `toxicity_threshold` - a float between 0 and 1, used when loading the machine learning model. Its default value is 0.9.

```yml
on: [issue_comment, pull_request_review]

jobs:
  toxic_check:
    runs-on: ubuntu-latest
    name: Toxicity check
    steps:
      - uses: actions/checkout@v2
      - name: Safe space - action step
        uses: charliegerard/safe-space@master
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          message: "this is my custom message"
          toxicity_threshold: 0.7
```

The action can take up to 40 seconds to run, so if you are testing it out in your repository, keep in mind that the bot's comment will not appear right after a toxic comment is posted.
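For context on what happens when a comment is flagged, posting the automatic reply amounts to a single GitHub REST API call. Here is a hedged sketch using the official Actions toolkit packages (`@actions/core`, `@actions/github`); the structure and fallback message are assumptions for illustration, not the action's actual source:

```js
// Hedged sketch: how an action like this could post its warning comment.
// Not the action's actual source.
const core = require('@actions/core');
const github = require('@actions/github');

async function run() {
  // Inputs correspond to the `with:` block in the workflow file.
  const token = core.getInput('GITHUB_TOKEN', { required: true });
  const message = core.getInput('message') || 'This comment may be toxic.'; // assumed fallback

  const octokit = github.getOctokit(token);
  const { context } = github;

  // issue_comment events cover comments on both issues and pull requests.
  await octokit.rest.issues.createComment({
    owner: context.repo.owner,
    repo: context.repo.repo,
    issue_number: context.issue.number,
    body: message,
  });
}

run().catch((err) => core.setFailed(err.message));
```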