{"id":15043902,"url":"https://github.com/noconnor/junitperf","last_synced_at":"2025-04-05T14:06:15.061Z","repository":{"id":26383517,"uuid":"99240159","full_name":"noconnor/JUnitPerf","owner":"noconnor","description":"API performance testing framework built using JUnit","archived":false,"fork":false,"pushed_at":"2024-10-04T13:58:16.000Z","size":913,"stargazers_count":73,"open_issues_count":19,"forks_count":19,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-04-05T14:06:00.240Z","etag":null,"topics":["java","java-8","junit","latency","performance-analysis","performance-testing","testing-tools","unittest"],"latest_commit_sha":null,"homepage":null,"language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/noconnor.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2017-08-03T14:14:44.000Z","updated_at":"2025-02-18T20:03:12.000Z","dependencies_parsed_at":"2024-06-07T17:58:50.453Z","dependency_job_id":"053df3c2-0ca2-4cbe-b721-c2ec202ec4f2","html_url":"https://github.com/noconnor/JUnitPerf","commit_stats":null,"previous_names":[],"tags_count":51,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/noconnor%2FJUnitPerf","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/noconnor%2FJUnitPerf/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/noconnor%2FJUnitPerf/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/noconnor%2FJUnitPerf/manifests
","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/noconnor","download_url":"https://codeload.github.com/noconnor/JUnitPerf/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247345852,"owners_count":20924102,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["java","java-8","junit","latency","performance-analysis","performance-testing","testing-tools","unittest"],"created_at":"2024-09-24T20:49:48.259Z","updated_at":"2025-04-05T14:06:15.039Z","avatar_url":"https://github.com/noconnor.png","language":"Java","readme":"# JUnitPerf ![Build Status](https://github.com/noconnor/JUnitPerf/actions/workflows/ci.yml/badge.svg) [![codecov](https://codecov.io/gh/noconnor/JUnitPerf/branch/master/graph/badge.svg)](https://codecov.io/gh/noconnor/JUnitPerf) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/com.github.noconnor/junitperf/badge.svg)](https://maven-badges.herokuapp.com/maven-central/com.github.noconnor/junitperf)\n\n\nAPI performance testing framework built using JUnit\n\nJUnitPerf provides extensions to the JUnit4 \u0026 JUnit5 frameworks, allowing unit tests to be extended to operate as \nperformance evaluation tests. \n\nThis library is best suited for testing remote API endpoints or component/integration testing. 
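\n\nAs an illustrative sketch (a hypothetical JUnit5 test, not part of this repository; the annotation parameters are documented in the configuration section below, while the interceptor class and import paths are assumptions based on the junit5 binding docs and the library Maven coordinates), an ordinary unit test can be promoted to a performance test like so:\n\n```java\nimport org.junit.jupiter.api.Test;\nimport org.junit.jupiter.api.extension.ExtendWith;\n\nimport com.github.noconnor.junitperf.JUnitPerfInterceptor;\nimport com.github.noconnor.junitperf.JUnitPerfTest;\nimport com.github.noconnor.junitperf.JUnitPerfTestRequirement;\n\n// Register the JUnitPerf interceptor so annotated tests run as performance tests\n@ExtendWith(JUnitPerfInterceptor.class)\npublic class ApiPerformanceTest {\n\n  // 10 threads for 60s (5s warm up); require at least 100 executions/sec\n  // and allow at most 1% errors (uncaught exceptions)\n  @Test\n  @JUnitPerfTest(threads = 10, durationMs = 60_000, warmUpMs = 5_000)\n  @JUnitPerfTestRequirement(executionsPerSec = 100, allowedErrorPercentage = 0.01f)\n  public void whenCallingTheRemoteApi_thenPerformanceRequirementsAreMet() {\n    // invoke the remote API under test here\n  }\n}\n```\n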
\nIf you are attempting to benchmark code blocks with nanosecond latency, consider using [JMH](http://openjdk.java.net/projects/code-tools/jmh/)   \n\nThis library's interface was heavily influenced by the deprecated \n[Contiperf library](https://github.com/lucaspouzac/contiperf) developed by [Lucas Pouzac](https://github.com/lucaspouzac)\n\n\u003cbr /\u003e\n\n## Contents\n\n[Usage Instructions](#usage-instructions)\n\n[Reports](#reports)\n\n[Statistics](#statistics)\n\n[Build Instructions](#build-instructions)\n\n\u003cbr /\u003e\n\n## Usage Instructions\n\nThe JUnitPerf library supports both JUnit4 and JUnit5 bindings. \nUsage documentation for each binding can be found here:\n\n* [JUnit4 usage documentation](docs/junit4.md)\n* [JUnit5 usage documentation](docs/junit5.md)\n\n\n\u003cbr /\u003e\n\n## Test Configuration Options \n\n`@JUnitPerfTest` has the following configuration parameters:\n\n| Property                   | Definition                                                                                                                                                  | Default value  |\n|:---------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------:|\n| threads                    | The total number of threads to use during test execution                                                                                                    |       1        |\n| durationMs                 | Total time to run the test in milliseconds (ms) (includes the warm-up period)                                                                               |     60,000     |\n| warmUpMs                   | Warm-up period in ms; test logic will be executed during warm-up, but results will not be considered during statistics evaluation                           |       0        |\n| maxExecutionsPerSecond     | Sets 
the maximum number of iterations per second (disabled by default)                                                                                      |       -1       |\n| rampUpPeriodMs             | Framework ramps up its executions per second smoothly over the duration of this period (disabled by default)                                                |       0        |\n| totalExecutions            | A best-effort target for the total number of times the test method should be executed; this setting takes precedence over durationMs (disabled by default)  |       -1       |\n\nThese configuration parameters can be overridden at runtime by specifying a VM argument of the form: `-Djunitperf.\u003cparam\u003e=X`\n\ne.g. to set a test duration of 10 minutes at runtime, specify `-Djunitperf.durationMs=600000`.\nThis will override the `durationMs` set in the `@JUnitPerfTest` annotation.\n\n**NOTE:** Do not use \"_\" when defining runtime integer or long override values, i.e. use `600000` and not `600_000`\n\n\u003cbr /\u003e\n\n`@JUnitPerfTestRequirement` has the following configuration parameters:\n\n| Property               | Definition                                                                                                                      |  Default value  |\n|:-----------------------|:------------------------------------------------------------------------------------------------------------------------------|:---------------:|\n| percentiles            | Comma-separated list of ms percentile targets, format: percentile1:limit,percentile2:limit (e.g. 
90:3.3,99:6.8)                |       \"\"        |\n| executionsPerSec       | Target executions per second                                                                                                  |        1        |\n| allowedErrorPercentage | Allowed % of errors (uncaught exceptions) during test execution (value between 0 and 1, where 1 = 100% errors allowed)        |        0        |\n| minLatency             | Expected minimum latency in ms; if the minimum latency is above this value, the test will fail                                |    disabled     |\n| maxLatency             | Expected maximum latency in ms; if the maximum latency is above this value, the test will fail                                |    disabled     |\n| meanLatency            | Expected mean latency in ms; if the mean latency is above this value, the test will fail                                      |    disabled     |\n\n\u003cbr /\u003e\n\n## Reports\n\n[HTML Reports](#html-reports)\n\n[Console Reporting](#console-reporting)\n\n[CSV Reporting](#csv-reporting)\n\n\u003cbr /\u003e\n\n#### HTML Reports\n\nAn example HTML report can be seen below:\n\n![HTML Report](https://raw.githubusercontent.com/noconnor/JUnitPerf/master/docs/common/images/example_report.png \"Example JUnitPerf HTML report\")\n\nHovering over the datapoints on the percentile latency graph will provide latency/percentile information.\n\nThe HTML reporter will generate an HTML performance report under `${BUILD_DIR}/reports/junitperf_report.html`\n\nIt is possible to override the template by placing a customised `src/main/resources/templates/report.template` file on the classpath ahead of the default template.\n\n\u003cbr /\u003e\n\n#### Console Reporting\n\nIt is also possible to use one of the other built-in reporters, such as the console reporter. 
\n\nExample output:\n\n```\n15:55:06.575 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Started at:   2017-10-28 15:55:05\n15:55:06.580 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Invocations:  765\n15:55:06.580 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator -   - Success:  765\n15:55:06.580 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator -   - Errors:   0\n15:55:06.580 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator -   - Errors:   0.0% - PASSED\n15:55:06.581 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - \n15:55:06.581 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Thread Count: 1\n15:55:06.581 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Warm up:      0ms\n15:55:06.581 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - \n15:55:06.581 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Execution time: 1000ms\n15:55:06.581 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Throughput:     766/s (Required: 10000/s) - FAILED!!\n15:55:06.581 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Min. latency:   1.012392ms\n15:55:06.582 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Max latency:    3.74209ms\n15:55:06.582 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - Ave latency:    1.2975845ms\n15:55:06.583 [main] INFO  c.g.n.j.r.p.ConsoleReportGenerator - \n```\n\n\u003cbr /\u003e\n\n#### CSV Reporting\n\nIt is also possible to use the built-in CSV reporter.\nThe CSV reporter will generate a CSV file at the default location `${BUILD_DIR}/reports/junitperf_report.csv`.\n\nThe CSV output will have the following format:\n\n```\ntestName,duration,threadCount,throughput,minLatencyMs,maxLatencyMs,meanLatencyMs,percentileData\nunittest1,10000,50,101,500000.0,1.430,6.430,1:0.0;2:0.0;3:0.0;4:0.0;5:0.0; ... 
;98:4.03434;99:4.83434680\n```\n\nNOTE: the percentileData is formatted as `percentile1:latency;percentile2:latency; ...`\n\n\n\u003cbr /\u003e\n\n\n## Statistics\n\nBy default, statistics are captured and calculated using the Apache [Descriptive Statistics library](http://commons.apache.org/proper/commons-math/userguide/stat.html#a1.2_Descriptive_statistics).\nSee [DescriptiveStatisticsCalculator](junitperf-core/src/main/java/com/github/noconnor/junitperf/statistics/providers/DescriptiveStatisticsCalculator.java) for more details.\n\nThe default statistics calculator has an \"infinite\" sampling window.\nAs a result, long-running tests may require a lot of memory to hold all test samples.\nThe window may be set to a fixed size as follows: `new DescriptiveStatisticsCalculator(1_000_000)` \n\n\n\u003cbr /\u003e\n\n## Build Instructions\n\nTo compile this project and run tests, execute the following command from the root project directory: `mvn clean test -Dgpg.skip`\n\nTo generate a library jar, execute: `mvn clean package -Dgpg.skip` \n\n**IntelliJ 14 Setup**\n\nTo run or contribute to this project using IntelliJ, you will require the following plugins:\n\n* [Lombok](https://plugins.jetbrains.com/plugin/6317)\n* CodeStyle Formatter\n\u003cbr /\u003e\nTo configure your IntelliJ settings to use this formatter:\n    * IntelliJ IDEA \u003e Preferences \u003e Editor \u003e Code Style \u003e Scheme \u003e Project (Apply Settings)\n\nTo resolve issues with Lombok annotations not being compiled during a module make, try setting the following preference:\n\n* Go to the preferences (settings) menu\n* Search for the \"Compiler\" section in the dialog window and then go to the \"Annotation Processors\" subsection\n* Tick the checkbox reading \"Enable annotation processing\"\n\n\u003cbr 
/\u003e\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnoconnor%2Fjunitperf","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fnoconnor%2Fjunitperf","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnoconnor%2Fjunitperf/lists"}