Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/xmlrunner/unittest-xml-reporting
unittest-based test runner with Ant/JUnit like XML reporting.
- Host: GitHub
- URL: https://github.com/xmlrunner/unittest-xml-reporting
- Owner: xmlrunner
- License: other
- Created: 2008-10-23T22:28:00.000Z (over 16 years ago)
- Default Branch: master
- Last Pushed: 2024-08-17T10:29:15.000Z (5 months ago)
- Last Synced: 2024-10-29T17:28:03.299Z (3 months ago)
- Topics: jenkins, junit-xml, nose, pytest, python, test-runner, unittest, xml-report, xunit, xunit-tests
- Language: Python
- Homepage:
- Size: 372 KB
- Stars: 295
- Watchers: 13
- Forks: 136
- Open Issues: 22
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-discoveries - unittest-xml-reporting - a unittest test runner that can save test results to XML files that can be consumed by a wide range of tools, such as build systems, IDEs and continuous integration servers _(`Python`)_ (Libraries)
README
[![License](https://img.shields.io/pypi/l/unittest-xml-reporting.svg)](https://pypi.python.org/pypi/unittest-xml-reporting/)
[![Latest Version](https://img.shields.io/pypi/v/unittest-xml-reporting.svg)](https://pypi.python.org/pypi/unittest-xml-reporting/)
[![Development Status](https://img.shields.io/pypi/status/unittest-xml-reporting.svg)](https://pypi.python.org/pypi/unittest-xml-reporting/)
[![Documentation Status](https://readthedocs.org/projects/unittest-xml-reporting/badge/?version=latest)](http://unittest-xml-reporting.readthedocs.io/en/latest/?badge=latest)
[![codecov.io Coverage Status](https://codecov.io/github/xmlrunner/unittest-xml-reporting/coverage.svg?branch=master)](https://codecov.io/github/xmlrunner/unittest-xml-reporting?branch=master)
[![Coveralls Coverage Status](https://coveralls.io/repos/xmlrunner/unittest-xml-reporting/badge.svg?branch=master&service=github)](https://coveralls.io/github/xmlrunner/unittest-xml-reporting?branch=master)
[![Requirements Status](https://requires.io/github/xmlrunner/unittest-xml-reporting/requirements.svg?branch=master)](https://requires.io/github/xmlrunner/unittest-xml-reporting/requirements/?branch=master)

# unittest-xml-reporting (aka xmlrunner)
A unittest test runner that can save test results to XML files in xUnit format.
The files can be consumed by a wide range of tools, such as build systems, IDEs
and continuous integration servers.

## Requirements
* Python 3.7+
* Please note: Python 3.6 reached end-of-life in Dec 2021; the last version supporting 3.6 was 3.1.0
* Please note: Python 3.5 reached end-of-life in Sep 2020; the last version supporting 3.5 was 3.1.0
* Please note: Python 2.7 reached end-of-life in Jan 2020; the last version supporting 2.7 was 2.5.2
* Please note: Python 3.4 reached end-of-life in Mar 2019; the last version supporting 3.4 was 2.5.2
* Please note: Python 2.6 reached end-of-life in Oct 2013; the last version supporting 2.6 was 1.14.0

## Limited support for `unittest.TestCase.subTest`
https://docs.python.org/3/library/unittest.html#unittest.TestCase.subTest
`unittest` has the concept of sub-tests for a `unittest.TestCase`; this doesn't map well to any existing xUnit concept, so you won't find it in the schema. This means you lose some granularity in the reports for sub-tests. `unittest` also does not report successful sub-tests, so the accounting won't be exact.
## Jenkins plugins
- Jenkins JUnit plugin : https://plugins.jenkins.io/junit/
- Jenkins xUnit plugin : https://plugins.jenkins.io/xunit/

### Jenkins JUnit plugin
This plugin does not perform XSD validation (at time of writing) and should parse the XML file without issues.
### Jenkins xUnit plugin version 1.100
- [Jenkins (junit-10.xsd), xunit plugin (2014-2018)](https://github.com/jenkinsci/xunit-plugin/blob/14c6e39c38408b9ed6280361484a13c6f5becca7/src/main/resources/org/jenkinsci/plugins/xunit/types/model/xsd/junit-10.xsd), version `1.100`.
This plugin does perform XSD validation, using the more lax XSD, and should parse the XML file without issues.
### Jenkins xUnit plugin version 1.104+
- [Jenkins (junit-10.xsd), xunit plugin (2018-current)](https://github.com/jenkinsci/xunit-plugin/blob/ae25da5089d4f94ac6c4669bf736e4d416cc4665/src/main/resources/org/jenkinsci/plugins/xunit/types/model/xsd/junit-10.xsd), version `1.104`+.
This plugin does perform XSD validation and uses the more strict XSD.
See https://github.com/xmlrunner/unittest-xml-reporting/issues/209
```
import io
import unittest

import xmlrunner

# run the tests, storing results in memory
out = io.BytesIO()
unittest.main(
    testRunner=xmlrunner.XMLTestRunner(output=out),
    failfast=False, buffer=False, catchbreak=False, exit=False)
```

Transform the results, removing extra attributes:

```
from xmlrunner.extra.xunit_plugin import transform

with open('TEST-report.xml', 'wb') as report:
    report.write(transform(out.getvalue()))
```
## JUnit Schema?
There are many tools claiming to write JUnit reports, so you will find many schemas with minor differences.
We used the XSD that was available in the Jenkins xUnit plugin version `1.100`; a copy is available under `tests/vendor/jenkins/xunit-plugin/.../junit-10.xsd` (see attached license).
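As a rough illustration of that dialect, here is a minimal, hand-written report (names and values are made up, and it is not guaranteed to validate against any particular XSD) inspected with the standard library:

```python
import xml.etree.ElementTree as ET

# an illustrative JUnit-style report: a <testsuite> with counting
# attributes, one <testcase> per test, and nested <failure> elements
REPORT = """\
<testsuite name="example" tests="2" errors="0" failures="1" skipped="0">
  <testcase classname="pkg.TestCase" name="test_ok" time="0.001"/>
  <testcase classname="pkg.TestCase" name="test_bad" time="0.002">
    <failure message="assertion failed">traceback here</failure>
  </testcase>
</testsuite>
"""

root = ET.fromstring(REPORT)
print(root.get("tests"))                      # total test cases: "2"
print(len(root.findall("testcase/failure")))  # failing cases: 1
```

The minor differences between schemas are mostly about which attributes are required and which extra elements are allowed, which is why the strict 1.104+ XSD rejects reports the lax 1.100 XSD accepts.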
You may also find these resources useful:
- https://stackoverflow.com/questions/4922867/what-is-the-junit-xml-format-specification-that-hudson-supports
- https://stackoverflow.com/questions/11241781/python-unittests-in-jenkins
- [JUnit-Schema (JUnit.xsd)](https://github.com/windyroad/JUnit-Schema/blob/master/JUnit.xsd)
- [Windyroad (JUnit.xsd)](http://windyroad.com.au/dl/Open%20Source/JUnit.xsd)
- [a gist (Jenkins xUnit test result schema)](https://gist.github.com/erikd/4192748)

## Installation
The easiest way to install unittest-xml-reporting is via
[Pip](http://www.pip-installer.org):

````bash
$ pip install unittest-xml-reporting
````

If you use Git and want to get the latest *development* version:
````bash
$ git clone https://github.com/xmlrunner/unittest-xml-reporting.git
$ cd unittest-xml-reporting
$ sudo python setup.py install
````

Or get the latest *development* version as a tarball:
````bash
$ wget https://github.com/xmlrunner/unittest-xml-reporting/archive/master.zip
$ unzip master.zip
$ cd unittest-xml-reporting
$ sudo python setup.py install
````

Or you can manually download the latest released version from
[PyPI](https://pypi.python.org/pypi/unittest-xml-reporting/).

## Command-line
````bash
python -m xmlrunner [options]
python -m xmlrunner discover [options]

# help
python -m xmlrunner -h
````

e.g.
````bash
python -m xmlrunner discover -t ~/mycode/tests -o /tmp/build/junit-reports
````

## Usage
The script below, adapted from the
[unittest](http://docs.python.org/library/unittest.html) documentation, shows how to use
`XMLTestRunner` in a very simple way. In fact, the only difference between
this script and the original one is the last line:

````python
import random
import unittest

import xmlrunner


class TestSequenceFunctions(unittest.TestCase):
    def setUp(self):
        self.seq = list(range(10))

    @unittest.skip("demonstrating skipping")
    def test_skipped(self):
        self.fail("shouldn't happen")

    def test_shuffle(self):
        # make sure the shuffled sequence does not lose any elements
        random.shuffle(self.seq)
        self.seq.sort()
        self.assertEqual(self.seq, list(range(10)))
        # should raise an exception for an immutable sequence
        self.assertRaises(TypeError, random.shuffle, (1, 2, 3))

    def test_choice(self):
        element = random.choice(self.seq)
        self.assertTrue(element in self.seq)

    def test_sample(self):
        with self.assertRaises(ValueError):
            random.sample(self.seq, 20)
        for element in random.sample(self.seq, 5):
            self.assertTrue(element in self.seq)


if __name__ == '__main__':
    unittest.main(
        testRunner=xmlrunner.XMLTestRunner(output='test-reports'),
        # these make sure that some options that are not applicable
        # remain hidden from the help menu.
        failfast=False, buffer=False, catchbreak=False)
````

### Reporting to a single file
````python
if __name__ == '__main__':
    with open('/path/to/results.xml', 'wb') as output:
        unittest.main(
            testRunner=xmlrunner.XMLTestRunner(output=output),
            failfast=False, buffer=False, catchbreak=False)
````

### Doctest support
The XMLTestRunner can also be used to report on doctest-style tests embedded in docstrings.
````python
import doctest

import xmlrunner


def twice(n):
    """
    >>> twice(5)
    10
    """
    return 2 * n


class Multiplicator(object):
    def threetimes(self, n):
        """
        >>> Multiplicator().threetimes(5)
        15
        """
        return 3 * n


if __name__ == "__main__":
    suite = doctest.DocTestSuite()
    xmlrunner.XMLTestRunner().run(suite)
````

### Django support
In order to plug `XMLTestRunner` into a Django project, add the following
to your `settings.py`:

````python
TEST_RUNNER = 'xmlrunner.extra.djangotestrunner.XMLTestRunner'
````

Also, the following settings are provided so you can fine-tune the reports:
|setting|default|values|description|
|-|-|-|-|
|`TEST_OUTPUT_VERBOSE`|`1`|`0\|1\|2`|Besides the XML reports generated by the test runner, a bunch of useful information is printed to the `sys.stderr` stream, just like the `TextTestRunner` does. Use this setting to choose between a verbose and a non-verbose output.|
|`TEST_OUTPUT_DESCRIPTIONS`|`False`|`True\|False`|If your test methods contain docstrings, you can display those docstrings instead of the test name (e.g. `module.TestCase.test_method`). To use this feature, enable verbose output by setting `TEST_OUTPUT_VERBOSE = 2`. Only affects stdout, not the XML output.|
|`TEST_OUTPUT_DIR`|`"."`|``|Tells the test runner where to put the XML reports. If the directory cannot be found, the test runner will try to create it before generating the XML files.|
|`TEST_OUTPUT_FILE_NAME`|`None`|``|Tells the test runner to output a single XML report with this filename under `os.path.join(TEST_OUTPUT_DIR, TEST_OUTPUT_FILE_NAME)`. Please note that for long-running tests, this keeps the results in memory longer than multiple reports would, and may use more resources.|

## Contributing
We are always looking for good contributions, so please just fork the
repository and send pull requests (with tests!).

If you would like write access to the repository, or to become a maintainer,
feel free to get in touch.

### Testing changes with `tox`
Please use `tox` to test your changes before sending a pull request.
You can find more information in the `tox` documentation.

```bash
$ pip install tox

# basic sanity test, friendly output
$ tox -e pytest

# all combinations
$ tox
```