https://github.com/britojr/diffi
Interpretation of Isolation Forests
- Host: GitHub
- URL: https://github.com/britojr/diffi
- Owner: britojr
- Created: 2020-07-25T15:25:38.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2022-10-08T14:54:46.000Z (over 3 years ago)
- Last Synced: 2023-03-01T12:51:26.858Z (about 3 years ago)
- Topics: anomaly-detection, diffi, explainability, interpretability, machine-learning
- Language: Python
- Homepage:
- Size: 6.84 KB
- Stars: 15
- Watchers: 0
- Forks: 5
- Open Issues: 1
Metadata Files:
- Readme: README.md
# diffi
This is an unofficial Python implementation of the **DIFFI (Depth-based Isolation Forest Feature Importance)** algorithm proposed by [[1]](#ref1). DIFFI is a model-based approach for assessing the global interpretability of an Isolation Forest in terms of feature importance.
This implementation assumes that the model used is an instance of [scikit-learn's Isolation Forest](https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.IsolationForest.html#sklearn.ensemble.IsolationForest).
## Usage
```python
from sklearn.ensemble import IsolationForest
from diffi.diffi import diffi_score

# X is a 2-D array of shape (n_samples, n_features)
clf = IsolationForest()
clf.fit(X)

# Global feature importance: one score per feature of X
feature_importance = diffi_score(clf, X)
```
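The importance scores can then be used to rank features for root cause analysis. A minimal sketch, assuming `diffi_score` returns one non-negative score per feature (as the usage above suggests); the feature names and score values below are hypothetical placeholders:

```python
import numpy as np

def rank_features(importance, feature_names):
    """Return feature names sorted from most to least important."""
    order = np.argsort(importance)[::-1]  # indices of scores, descending
    return [feature_names[i] for i in order]

# Hypothetical scores, e.g. as produced by diffi_score(clf, X)
scores = np.array([0.1, 0.7, 0.2])
names = ["temp", "pressure", "vibration"]
print(rank_features(scores, names))  # → ['pressure', 'vibration', 'temp']
```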
## References
<a name="ref1"></a>[1] Carletti, Mattia, Chiara Masiero, Alessandro Beghi, and Gian Antonio Susto. ["Explainable machine learning in industry 4.0: evaluating feature importance in anomaly detection to enable root cause analysis."](https://ieeexplore.ieee.org/abstract/document/8913901) In 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), pp. 21-26. IEEE, 2019.