{"id":19809864,"url":"https://github.com/lewagon/nbresult","last_synced_at":"2025-08-10T04:33:20.247Z","repository":{"id":39978220,"uuid":"313576188","full_name":"lewagon/nbresult","owner":"lewagon","description":"Testing Library for Jupyter Notebooks","archived":false,"fork":false,"pushed_at":"2025-01-15T14:40:32.000Z","size":2037,"stargazers_count":8,"open_issues_count":0,"forks_count":2,"subscribers_count":10,"default_branch":"master","last_synced_at":"2025-07-10T22:21:09.235Z","etag":null,"topics":["notebook-jupyter","python","testing"],"latest_commit_sha":null,"homepage":"https://pypi.org/project/nbresult/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/lewagon.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2020-11-17T09:57:30.000Z","updated_at":"2025-04-04T04:20:36.000Z","dependencies_parsed_at":"2024-02-22T19:29:01.067Z","dependency_job_id":"9ea26ca6-5697-4b4c-970d-b3bffed9fb51","html_url":"https://github.com/lewagon/nbresult","commit_stats":{"total_commits":44,"total_committers":4,"mean_commits":11.0,"dds":"0.36363636363636365","last_synced_commit":"2588ad55a6bedf1a791323fcbfa04d83d50e233d"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/lewagon/nbresult","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewagon%2Fnbresult","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewagon%2Fnbresult/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewagon%2Fn
bresult/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewagon%2Fnbresult/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/lewagon","download_url":"https://codeload.github.com/lewagon/nbresult/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lewagon%2Fnbresult/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":269677392,"owners_count":24457845,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-08-10T02:00:08.965Z","response_time":71,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["notebook-jupyter","python","testing"],"created_at":"2024-11-12T09:18:30.540Z","updated_at":"2025-08-10T04:33:20.187Z","avatar_url":"https://github.com/lewagon.png","language":"Python","funding_links":[],"categories":[],"sub_categories":[],"readme":"# nbresult\n\nA simple package to test Jupyter notebook results for Le Wagon's Data Science Bootcamp.\n\n## 1. Installation\n\nInstall with `pip` from [PyPI](https://pypi.org/):\n\n```bash\npip install nbresult\n```\n\n## 2. 
Usage\n\n### 2.1 Basic usage\nConsider the default data challenge architecture:\n\n```bash\n.\n├── challenge.ipynb\n├── Makefile\n├── README.md\n├── data\n│   └── data.csv\n└── tests\n    └── __init__.py\n```\n\nSuppose you want to test a variable `log_model_score` from the `challenge.ipynb` notebook with `pytest`:\n\n![variable](img/variable.png)\n\nAnywhere in the notebook, you can add a cell with the following code:\n\n```python\nfrom nbresult import ChallengeResult\n\nresult = ChallengeResult('score',\n    score=log_model_score\n)\nresult.write()\n```\n\nThis outputs a `score.pickle` file in the `tests` directory:\n\n```bash\n.\n├── challenge.ipynb\n├── Makefile\n├── README.md\n├── data\n│   └── data.csv\n└── tests\n    ├── __init__.py\n    └── score.pickle\n```\n\nNow you can write a test on `log_model_score` with `pytest`. Create a `test_score.py` file:\n\n```python\n# tests/test_score.py\nfrom nbresult import ChallengeResultTestCase\n\n\nclass TestScore(ChallengeResultTestCase):\n\n    def test_score_is_above_82(self):\n        self.assertGreater(self.result.score, 0.82)\n```\n\nFinally, run your tests with `pytest`:\n\n```bash\npytest tests/test_score.py\n```\n\n![pytest](img/pytest_check.png)\n\nOR\n\nRun the tests with `make`:\n- Set up a `Makefile` (note: the recipe line must be indented with a tab)\n\n```make\n# Makefile\n\ndefault: pytest\n\npytest:\n\tPYTHONDONTWRITEBYTECODE=1 pytest -v --color=yes\n```\n\n- Run `make`\n\n![make](img/make_check.png)\n\nOR\n\nRun the tests inside the notebook:\n\n```python\nfrom nbresult import ChallengeResult\n\nresult = ChallengeResult('score',\n    score=log_model_score\n)\nresult.write()\nprint(result.check())\n```\n\n![notebook](img/notebook_check.png)\n\n### 2.2 Advanced usage\nFor a more advanced folder structure, you can also specify a `subdir` folder in which to store \u0026 read the pickle file:\n\n```python\nfrom nbresult import ChallengeResult\n\nresult = ChallengeResult('score',\n    subdir='a', # This will store the pickle in 
tests/a/score.pickle\n    score=log_model_score\n)\nresult.write()\nresult.check()\n```\n\nCheck out the detailed example below:\n\n![subdir](img/subdir_demo.png)\n\n## Testing\n\nRun `make`\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flewagon%2Fnbresult","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Flewagon%2Fnbresult","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flewagon%2Fnbresult/lists"}