{"id":13652362,"url":"https://github.com/Quantmetry/resources-intelligibility","last_synced_at":"2025-04-23T03:30:45.089Z","repository":{"id":111569460,"uuid":"158420515","full_name":"Quantmetry/resources-intelligibility","owner":"Quantmetry","description":"Some resources for intelligibility analysis of machine learning models. ","archived":false,"fork":false,"pushed_at":"2018-11-22T11:05:34.000Z","size":703,"stargazers_count":4,"open_issues_count":0,"forks_count":1,"subscribers_count":4,"default_branch":"master","last_synced_at":"2024-11-10T03:35:25.308Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Quantmetry.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2018-11-20T16:35:36.000Z","updated_at":"2022-11-05T09:43:23.000Z","dependencies_parsed_at":"2023-07-30T15:15:27.496Z","dependency_job_id":null,"html_url":"https://github.com/Quantmetry/resources-intelligibility","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Quantmetry%2Fresources-intelligibility","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Quantmetry%2Fresources-intelligibility/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Quantmetry%2Fresources-intelligibility/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Quantmetry%2Fresources-intelligibility/manifests","owner_url":"https://repos.ecosyste.ms/api/v
1/hosts/GitHub/owners/Quantmetry","download_url":"https://codeload.github.com/Quantmetry/resources-intelligibility/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250365255,"owners_count":21418654,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-02T02:00:58.636Z","updated_at":"2025-04-23T03:30:41.902Z","avatar_url":"https://github.com/Quantmetry.png","language":"Jupyter Notebook","readme":"# resources-intelligibility\n\nSome resources for intelligibility analysis of machine learning models, mostly in French.\n\nNotes on setting up the project\n-------------------------------\n\n- With *python3* installed (tested with Python 3.6), make sure you have access to *pip*.\n- Using the commands below, create a local virtual environment and activate it.\n- Install the dependencies from *requirements.txt*.\n\n  ```\n  $ python3 -m venv .venv\n  $ source .venv/bin/activate\n  (.venv) $ pip install -r requirements.txt\n  ```\n\n- Go to the *data/* folder and download the required data (~0.5 MB) using the link in *data/howtogetdata.txt*. 
At the end of this step, you should have a *carInsurance_train.csv* file in the *data/* folder.\n- Start a Jupyter server.\n\n  ```\n  (.venv) $ jupyter notebook\n  ```\n\nFeatures\n--------\n\nIn the *notebooks/* folder, you will find demos of several intelligibility techniques:\n\n- Partie1\\_Construction\\_Modèle.ipynb\n- Partie2\\_Analyse\\_sensibilité\\_des\\_prédictions.ipynb\n- Partie3\\_Décomposition\\_en\\_contributions.ipynb\n- Partie4\\_Décomposition\\_en\\_règles.ipynb\n\nRun *Partie1* first: it writes a pickle containing the data and model used by the other notebooks. After that, the notebooks are independent of each other.\n\n\nCredits\n-------\nThis work was done by Quantmetry R\u0026D in 2018.","funding_links":[],"categories":["Interpretability / Explainable AI"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FQuantmetry%2Fresources-intelligibility","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FQuantmetry%2Fresources-intelligibility","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FQuantmetry%2Fresources-intelligibility/lists"}