{"id":13994452,"url":"https://github.com/httprunner/PyUnitReport","last_synced_at":"2025-07-22T19:32:26.357Z","repository":{"id":57458183,"uuid":"97014946","full_name":"httprunner/PyUnitReport","owner":"httprunner","description":"A unit test runner for Python, and generate HTML reports.","archived":false,"fork":true,"pushed_at":"2022-10-18T17:47:46.000Z","size":738,"stargazers_count":61,"open_issues_count":5,"forks_count":14,"subscribers_count":7,"default_branch":"master","last_synced_at":"2024-09-20T02:17:35.320Z","etag":null,"topics":["html-report","test-runner","unittest"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":"oldani/HtmlTestRunner","license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/httprunner.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-07-12T14:04:52.000Z","updated_at":"2024-09-01T15:33:33.000Z","dependencies_parsed_at":null,"dependency_job_id":null,"html_url":"https://github.com/httprunner/PyUnitReport","commit_stats":null,"previous_names":[],"tags_count":6,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/httprunner%2FPyUnitReport","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/httprunner%2FPyUnitReport/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/httprunner%2FPyUnitReport/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/httprunner%2FPyUnitReport/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/httprunner","download_url":"https://codeload.github.com/httprunner/PyUnitReport/tar.gz/refs/heads/master","hos
t":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":227166727,"owners_count":17740971,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["html-report","test-runner","unittest"],"created_at":"2024-08-09T14:02:52.977Z","updated_at":"2024-11-29T16:31:24.863Z","avatar_url":"https://github.com/httprunner.png","language":"Python","readme":"# PyUnitReport\n\nPyUnitReport is a unittest test runner that saves test results in HTML files, for human-readable presentation of results.\n\n## Installation\n\n```bash\n$ pip install PyUnitReport\n```\n\n## Usage\n\n### testcase\n\n```python\nfrom pyunitreport import HTMLTestRunner\nimport unittest\n\nclass TestStringMethods(unittest.TestCase):\n    \"\"\" Example test for HtmlRunner. \"\"\"\n\n    def test_upper(self):\n        self.assertEqual('foo'.upper(), 'FOO')\n\n    def test_isupper(self):\n        self.assertTrue('FOO'.isupper())\n        self.assertFalse('Foo'.isupper())\n\n    def test_split(self):\n        s = 'hello world'\n        self.assertEqual(s.split(), ['hello', 'world'])\n        # check that s.split fails when the separator is not a string\n        with self.assertRaises(TypeError):\n            s.split(2)\n\n    def test_error(self):\n        \"\"\" This test should be marked as error one. \"\"\"\n        raise ValueError\n\n    def test_fail(self):\n        \"\"\" This test should fail. \"\"\"\n        self.assertEqual(1, 2)\n\n    @unittest.skip(\"This is a skipped test.\")\n    def test_skip(self):\n        \"\"\" This test should be skipped. 
\"\"\"\n        pass\n\nif __name__ == '__main__':\n    unittest.main(testRunner=HTMLTestRunner(output='example_dir'))\n```\n\nIn most cases, you can use `PyUnitReport` with `unittest.main`; just pass the runner via the `testRunner` keyword.\n\nFor `HTMLTestRunner`, the only required parameter is `output`, which specifies the directory for the generated report. To name the report yourself, use the `report_name` parameter; otherwise the report is named after the datetime at which the tests were run. To run test cases in `failfast` mode, pass `failfast=True`.\n\nHere is another way to run the test cases:\n\n```python\nfrom pyunitreport import HTMLTestRunner\n\nkwargs = {\n    \"output\": output_folder_name,  # directory for the generated report\n    \"report_name\": report_name,\n    \"failfast\": True\n}\nresult = HTMLTestRunner(**kwargs).run(task_suite)\n```\n\n### testsuite\n\nIt works with test suites too: just create a runner instance and call its `run` method with your suite.\n\nHere is an example:\n\n```python\nfrom unittest import TestLoader, TestSuite\nfrom pyunitreport import HTMLTestRunner\nfrom ExampleTest import ExampleTest\nfrom Example2Test import Example2Test\n\nexample_tests = TestLoader().loadTestsFromTestCase(ExampleTest)\nexample2_tests = TestLoader().loadTestsFromTestCase(Example2Test)\n\nsuite = TestSuite([example_tests, example2_tests])\nkwargs = {\n    \"output\": output_folder_name,\n    \"report_name\": report_name,\n    \"failfast\": True\n}\nrunner = HTMLTestRunner(**kwargs)\nrunner.run(suite)\n```\n\n## Output\n\n### Console output\n\nThis is an example of what you get in the console:\n\n```text\n$ python examples/testcase.py\n\nRunning tests...\n----------------------------------------------------------------------\n This test should be marked as error one. ... ERROR (0.000575)s\n This test should fail. ... FAIL (0.000564)s\n test_isupper (__main__.TestStringMethods) ... OK (0.000149)s\n This test should be skipped. ... 
SKIP (0.000067)s\n test_split (__main__.TestStringMethods) ... OK (0.000167)s\n test_upper (__main__.TestStringMethods) ... OK (0.000134)s\n\n======================================================================\nERROR [0.000575s]: This test should be marked as error one.\n----------------------------------------------------------------------\nTraceback (most recent call last):\n  File \"examples/testcase.py\", line 23, in test_error\n    raise ValueError\nValueError\n\n======================================================================\nFAIL [0.000564s]: This test should fail.\n----------------------------------------------------------------------\nTraceback (most recent call last):\n  File \"examples/testcase.py\", line 27, in test_fail\n    self.assertEqual(1, 2)\nAssertionError: 1 != 2\n\n----------------------------------------------------------------------\nRan 6 tests in 0.002s\n\nFAILED\n (Failures=1, Errors=1, Skipped=1)\n\nGenerating HTML reports...\nTemplate is not specified, load default template instead.\nReports generated: /Users/Leo/MyProjects/ApiTestEngine/src/pyunitreport/reports/example_dir/2017-07-26-23-33-49.html\n```\n\n### HTML Output\n\n![html output](resources/html_output.gif)\n\n![html output](resources/html_output.png)\n","funding_links":[],"categories":["Python"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhttprunner%2FPyUnitReport","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fhttprunner%2FPyUnitReport","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhttprunner%2FPyUnitReport/lists"}