{"id":23525388,"url":"https://github.com/pyscript/upytest","last_synced_at":"2025-10-31T17:30:35.038Z","repository":{"id":248554851,"uuid":"827887931","full_name":"pyscript/upytest","owner":"pyscript","description":"A very simple pytest like module for testing code written with PyScript using MicroPython or Pyodide as the runtime.","archived":false,"fork":false,"pushed_at":"2025-02-03T12:40:49.000Z","size":53,"stargazers_count":2,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-02-13T02:51:24.279Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://pyscript.net/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/pyscript.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.md","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-07-12T15:42:06.000Z","updated_at":"2025-02-03T12:38:55.000Z","dependencies_parsed_at":"2024-10-28T19:30:16.167Z","dependency_job_id":null,"html_url":"https://github.com/pyscript/upytest","commit_stats":{"total_commits":43,"total_committers":1,"mean_commits":43.0,"dds":0.0,"last_synced_commit":"6b7b3839535aa45eb9a87afa28df03eb8bc44612"},"previous_names":["ntoll/upytest","pyscript/upytest"],"tags_count":10,"template":false,"template_full_name":"ntoll/codespaces-project-template-pyscript","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pyscript%2Fupytest","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pyscript%2Fupytest/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pys
cript%2Fupytest/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pyscript%2Fupytest/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/pyscript","download_url":"https://codeload.github.com/pyscript/upytest/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":239221241,"owners_count":19602378,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-12-25T19:08:44.416Z","updated_at":"2025-10-31T17:30:34.978Z","avatar_url":"https://github.com/pyscript.png","language":"Python","funding_links":[],"categories":[],"sub_categories":[],"readme":"# uPyTest (MicroPytest) 🔬🐍✔️\n\nA small and very limited module for very simple [PyTest](https://pytest.org) \ninspired tests to run in the [MicroPython](https://micropython.org/) and\n[Pyodide](https://pyodide.org/) interpreters within \n[PyScript](https://pyscript.net/). \n\nIt currently only implements naive versions of:\n\n* Discovery of tests on the filesystem.\n* `assert` statements for testing state.\n* `assert \u003csomething\u003e, \"Some description\"` to add contextual information.\n* Global `setup` and `teardown` functions via `conftest.py`.\n* Module specific `setup` and `teardown` functions.\n* A `skip(\"reason\")` decorator for skipping test functions.\n* Checks for expected exceptions via a `raises` context manager.\n* Synchronous and asynchronous test cases.\n* Works well with [uMock](https://github.com/ntoll/umock).\n\nThere are two major reasons this project exists:\n\n1. 
MicroPython doesn't have a test framework like PyTest, and folks want to\n   test PyScript code running in MicroPython.\n2. Using the same test framework with both MicroPython and Pyodide will ensure\n   the test suite can exercise your code running on both interpreters (and\n   perhaps highlight places where behaviour differs).\n\nOf course, **you should write tests for your code**! If only because it means\nyou'll be able to make changes in the future with confidence. The aim of\n`upytest` is to make this as simple as possible, in a way that is familiar to\nthose who use PyTest, when using PyScript.\n\n## Usage\n\n**This module is for use within PyScript.**\n\n### Setup / Run tests\n\n1. Ensure the `upytest.py` file is in your Python path. You may need to copy\n   this over using the \n   [files settings](https://docs.pyscript.net/2024.8.2/user-guide/configuration/#files). \n   (See the `config.json` file in this repository for an example of this in \n   action.)\n2. Create and copy over your tests. Once again, use the files settings; the\n   `config.json` in this repository demonstrates how to copy over the content\n   of the `tests` directory found in this repository.\n3. In your `main.py` (or whatever you call your Python script for starting the\n   tests), simply `import upytest` and await the `run` method while passing in\n   one or more strings indicating the tests to run:\n   ```python\n   import upytest\n\n\n   results = await upytest.run(\"./tests\")\n   ```\n   (This is demonstrated in the `main.py` file in this repository.)\n4. The specification may simply be a string describing the directory in\n   which to start looking for test modules (e.g. `\"./tests\"`), or strings\n   naming specific test modules, test classes or test functions to run\n   (of the form: \"module_path\", \"module_path::TestClass\" or\n   \"module_path::test_function\"; e.g. 
`\"tests/test_module.py\"`, \n   `\"tests/test_module.py::TestClass\"` or\n   `\"tests/test_module.py::test_stuff\"`).\n5. If a named `pattern` argument is provided, it will be used to match test\n   modules within any target directories given in the specification. The\n   default pattern is \"test_*.py\".\n6. If a named `random` boolean argument is provided (default: `False`), then\n   the order in which modules and tests are run will be randomized.\n7. If there is a `conftest.py` file in any of the specified directories\n   containing a test module, it will be imported, and any global `setup` and\n   `teardown` functions it defines will be used for modules found within that\n   directory. These `setup` and `teardown` functions can be overridden in the\n   individual test modules.\n8. The result of awaiting `upytest.run` is a Python dictionary containing \n   lists of tests bucketed under the keys: `\"passes\"`, `\"fails\"` and \n   `\"skipped\"`. The result also provides information about the Python\n   interpreter used to run the tests, along with a boolean flag to indicate if\n   the tests were running in a web worker. These results are JSON serializable\n   and can be used for further processing and analysis (again, see `main.py`\n   for an example of this in action).\n9. 
In your `index.html`, make sure you use the `terminal` attribute\n   when referencing your Python script (as in the `index.html` file in\n   this repository):\n   ```html\n   \u003cscript type=\"mpy\" src=\"./main.py\" config=\"./config.json\" terminal\u003e\u003c/script\u003e\n   ```\n   You should be able to use the `type` attribute of `\"mpy\"` (for MicroPython)\n   and `\"py\"` (for Pyodide) interchangeably.\n\nFinally, point your browser at your `index.html` and you should see the test\nsuite run.\n\n### Writing tests\n\n**`upytest` is only _inspired by PyTest_ and is not intended as a replacement.**\n\nSome of the core concepts and capabilities used in `upytest` will be familiar \nfrom using PyTest, but the specific API, capabilities and implementation\ndetails _will be very different_.\n\nTo create a test suite, ensure your test functions are contained in modules,\nwhose names start with `test_`, found inside your `tests` directory. If you want\nto change this pattern for matching test modules, pass in a `pattern` argument\nas a string to the `upytest.run` method (whose default is currently\n`pattern=\"test_*.py\"`).\n\nInside the test module, test functions are identified by having `test_`\nprepended to their name:\n\n```python\ndef test_something():\n    assert True, \"This will not fail.\"\n```\n\nJust like PyTest, use the `assert` statement to verify test expectations. As\nshown above, a string following a comma is used as the value for any resulting\n`AssertionError` should the `assert` fail.\n\nIf you need to group tests together within a test module, use a class\ndefinition whose name starts with `Test` and whose test methods start with\n`test_`:\n\n```python\nclass TestClass:\n\n   def test_something(self):\n      assert True, \"This will not fail\"\n```\n\nSometimes you need to skip existing tests. 
Simply use the `skip` decorator like\nthis:\n\n```python\nimport upytest\n\n\n@upytest.skip(\"This is my reason for skipping the test\")\ndef test_skipped():\n    assert False, \"This won't fail, because it's skipped!\"\n```\n\nThe `skip` decorator takes an optional string to describe why the test function\nis to be skipped. It also takes an optional `skip_when` argument whose default\nvalue is `True`. If `skip_when` is falsy, the decorated test **will NOT be \nskipped**. This is useful for conditional skipping of tests. E.g.:\n\n```python\nimport upytest\n\n\n@upytest.skip(\"Skip this if using MicroPython\", skip_when=upytest.is_micropython)\ndef test_something():\n   assert 1 == 1  # Only asserted if using Pyodide.\n```\n\nOften you need to check that a certain exception is raised when a problematic\nstate is reached. To do this, use the `raises` context manager like this:\n\n```python\nimport upytest\n\n\ndef test_raises_exception():\n    with upytest.raises(ValueError, KeyError):\n        raise ValueError(\"BOOM!\")\n```\n\nThe `raises` context manager requires one or more expected exceptions that\nshould be raised while the code within its context is evaluated. If no such\nexceptions are raised, the test fails.\n\nSometimes you need to perform tasks either before or after a number of tests\nare run. For example, they might be needed to create a certain state, or clean\nup and reset after tests are run. These tasks are achieved by two functions\ncalled `setup` (run immediately before tests) and `teardown` (run immediately \nafter tests).\n\nThese functions are entirely optional and may be defined in either of two\nplaces:\n\n* In a `conftest.py` file in the root of your test directory. Any `setup` or\n  `teardown` function defined here will be _applied to all tests_, unless\n  you override these functions...\n* In individual test modules. The `setup` and `teardown` functions in test\n  modules _replace any global versions of these functions defined in \n  conftest.py_. 
They only apply to _test functions found within the module_ in\n  which they are defined. If you still need to run the global functions, just \n  import them and call them from within your test module versions.\n\nAll test functions, along with `setup` and `teardown`, can be awaitable /\nasynchronous.\n\nAll these features are demonstrated within the test modules in the `tests`\ndirectory of this project.\n\n### Test output\n\nTest output tries to be informative, indicating the time taken, the number of\ntests, and the number of passes, fails and skips, along with tracebacks for\nfailures.\n\nDue to the small size of MicroPython, the information from the traceback for\nfailing tests may not appear as comprehensive as the information you may be\nused to seeing after a run of classic PyTest. Nevertheless, line numbers and the\ncall stack are included to provide you with enough information to see what has\nfailed, and where.\n\nWhen outputting a test run, a `.` represents a passing test, an `F` a failure\nand an `S` a skipped test.\n\nThe output for the test suite for this module is a good example of all the\ndifferent sorts of information you may see:\n\n```\nPython interpreter:  webassembly 3.4.0; MicroPython v1.24.0-preview.114.g77bd8fe5b on 2024-07-19 \nRunning in worker:  False \nUsing tests/conftest.py for global setup and teardown in tests/test_core_functionality.py::TestClass.\nFound 1 test module[s]. Running 8 test[s].\n\nF.FSSF..\n================================= FAILURES =================================\nFailed: tests/test_core_functionality.py::TestClass.test_does_not_raise_exception_fails\nTraceback (most recent call last):\n  File \"upytest.py\", line 156, in run\n  File \"tests/test_core_functionality.py\", line 127, in test_does_not_raise_exception_fails\nAssertionError: Did not raise expected exception. 
Expected ValueError; but got None.\n\n\nFailed: tests/test_core_functionality.py::TestClass.test_fails\nTraceback (most recent call last):\n  File \"upytest.py\", line 156, in run\n  File \"tests/test_core_functionality.py\", line 119, in test_fails\nAssertionError: This test will fail\n\n\nFailed: tests/test_core_functionality.py::TestClass.test_does_not_raise_expected_exception_fails\nTraceback (most recent call last):\n  File \"upytest.py\", line 156, in run\n  File \"tests/test_core_functionality.py\", line 131, in test_does_not_raise_expected_exception_fails\nAssertionError: Did not raise expected exception. Expected ValueError, AssertionError; but got TypeError.\n\n================================= SKIPPED ==================================\nSkipped: tests/test_core_functionality.py::TestClass.test_skipped\nReason: This test will be skipped\n\nSkipped: tests/test_core_functionality.py::TestClass.test_when_skipped\nReason: This test will be skipped with a skip_when condition\n========================= short test summary info ==========================\n3 failed, 2 skipped, 3 passed in 0.00 seconds\n```\n\n## Developer setup\n\nThis is easy:\n\n1. Clone the project.\n2. Start a local web server: `python -m http.server`\n3. Point your browser at http://localhost:8000/\n4. Change code and refresh your browser to check your changes.\n5. **DO NOT CREATE A NEW FEATURE WITHOUT FIRST CREATING AN ISSUE FOR IT IN WHICH\n   YOU PROPOSE YOUR CHANGE**. (We want to avoid a situation where you work hard\n   on something that is ultimately rejected by the maintainers.)\n6. Given all the above, pull requests are welcome and greatly appreciated.\n\nWe expect all contributors to abide by the spirit of our\n[code of conduct](./CODE_OF_CONDUCT.md).\n\n## Testing uPyTest\n\nSee the content of the `tests` directory in this repository. To run the test\nsuite, just follow steps 1, 2 and 3 in the developer setup section. The\n`main.py` script tests the test framework itself. 
From the docstring for that\nmodule:\n\n\u003e How do you test a test framework?\n\u003e\n\u003e You can't use the test framework to test itself, because it may contain bugs!\n\u003e Hence this script, which uses upytest to run tests and check the results are as\n\u003e expected. The expected results are hard-coded in this script, and the actual\n\u003e results are generated by running tests with upytest. The script then compares\n\u003e the expected and actual results to ensure they match.\n\u003e\n\u003e Finally, the script creates a div element to display the results in the page.\n\u003e If tests fail, the script will raise an AssertionError, which will be\n\u003e displayed with a red background. If the tests pass, the script will display a\n\u003e message with a green background.\n\u003e\n\u003e There are two sorts of expected results: the number of tests that pass, fail,\n\u003e and are skipped, and the names of the tests that pass, fail, and are skipped.\n\u003e Tests that pass end with \"passes\", tests that fail end with \"fails\", and tests\n\u003e that are skipped end with \"skipped\".\n\u003e\n\u003e This script will work with both MicroPython and Pyodide, just so we can ensure\n\u003e the test framework works in both environments. The index.html file uses\n\u003e MicroPython, the index2.html file uses Pyodide.\n\u003e\n\u003e That's it! Now we can test a test framework with a meta-test framework. 
🤯\n\n## License\n\nCopyright (c) 2024 Nicholas H.Tollervey\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpyscript%2Fupytest","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpyscript%2Fupytest","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpyscript%2Fupytest/lists"}