
https://github.com/sayakpaul/benchmarking-and-mli-experiments-on-the-adult-dataset

Contains benchmarking and interpretability experiments on the Adult dataset using several libraries

data-science fastai h2oai interpretable-machine-learning machine-learning microsoft-interpret tensorflow


README


The initial experiments were part of an assignment from TCS ILP Innovations' Lab. Later, as my appetite for the wonderful field of machine learning grew, I decided to give the dataset another try with newer libraries.

This repository includes benchmarking and interpretability experiments on the [Adult dataset](https://archive.ics.uci.edu/ml/datasets/adult) using libraries such as [`fastai`](https://docs.fast.ai), [`h2o`](http://docs.h2o.ai), and [`interpret`](https://github.com/Microsoft/interpret). Along with these, I show how the `interpret` library can be used to construct explanations for `sklearn` models. **Note** that `keras` models can be wrapped as `sklearn` estimators, which lets `interpret` work on them as well.

I show how easy it is to interpret a blackbox machine learning model with `interpret`; I think the library really lives up to its name. Along with this, I also show how to use a decision-tree surrogate to explain models in `h2o`.
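The surrogate idea is library-agnostic: train an interpretable tree to mimic the blackbox's predictions, then read the tree. A minimal sketch in plain `sklearn` (the notebooks use h2o's own surrogate support, and synthetic data stands in for Adult here):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=10, random_state=42)

# 1. Fit the blackbox model on the true labels.
blackbox = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)

# 2. Fit a shallow, readable tree on the blackbox's *predictions*, not the labels.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=42)
surrogate.fit(X, blackbox.predict(X))

# 3. Fidelity: how closely the surrogate mimics the blackbox.
fidelity = accuracy_score(blackbox.predict(X), surrogate.predict(X))
print(f"Surrogate fidelity: {fidelity:.2f}")
```

A high fidelity score means the shallow tree's splits are a faithful, human-readable summary of the blackbox's decision logic on this data.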

To do:
- **Annotate the notebooks in plain English and include short explanations of the various interpretability methods used.**