{"id":18304625,"url":"https://github.com/numtel/meteor-benchmark-packages","last_synced_at":"2025-04-05T15:31:22.741Z","repository":{"id":24450946,"uuid":"27853431","full_name":"numtel/meteor-benchmark-packages","owner":"numtel","description":"Perform benchmarks while testing your Meteor packages","archived":false,"fork":false,"pushed_at":"2014-12-12T10:16:55.000Z","size":368,"stargazers_count":6,"open_issues_count":0,"forks_count":0,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-03-21T06:41:24.454Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/numtel.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2014-12-11T04:05:35.000Z","updated_at":"2017-05-05T17:37:23.000Z","dependencies_parsed_at":"2022-07-19T23:17:57.344Z","dependency_job_id":null,"html_url":"https://github.com/numtel/meteor-benchmark-packages","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/numtel%2Fmeteor-benchmark-packages","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/numtel%2Fmeteor-benchmark-packages/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/numtel%2Fmeteor-benchmark-packages/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/numtel%2Fmeteor-benchmark-packages/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/numtel","download_url":"https://codeload.github.com/numtel/meteor-benchmark-packages/tar.
gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247358683,"owners_count":20926266,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-05T15:29:40.860Z","updated_at":"2025-04-05T15:31:22.071Z","avatar_url":"https://github.com/numtel.png","language":"JavaScript","funding_links":[],"categories":[],"sub_categories":[],"readme":"# numtel:benchmark-packages\n\nPerform benchmarks on your Meteor packages while using the `meteor --test-packages` browser interface\n\n![screenshot](screenshot.png)\n\nClick on the 'Benchmark' button added to the interface, or press `F2`, to toggle the benchmark overlay.\n\nSelect a test and specify options using JSON. Raw results are also printed to the browser's JavaScript console.\n\n* [View source for Array insert benchmark...](test/arrayInserts.js)\n\n## Installation\n\nAdd the following lines to `package.js`:\n```javascript\nPackage.onTest(function(api) {\n  // ...\n  api.use('tinytest');\n\n  // Add the package\n  api.use('numtel:benchmark-packages@0.0.1');\n\n  // Benchmarks are only available on the client\n  api.addFiles([\n    'test/arrayInserts.js'\n  ], 'client');\n});\n```\n\n## Defining benchmarks\n\nDefine each benchmark using the following syntax. 
Each case has meta values prefixed with an underscore, as well as the methods to test.\n\n```javascript\nBenchmark.addCase({\n  _label: 'My test', // Human readable name\n  _value: 'rate', // (Optional, default 'rate') Value to display on graph\n  _default: { // Specify default options (must be JSON-serializable)\n    count: 50000, // (Optional, default 1) How many items to insert\n    sampleSize: 4 // (Required) Number of times to run each test\n  },\n  // Provide each method to test\n  'original-method': {\n    // The main function to be timed\n    run: function(options, done){\n      // Perform operation...\n      done();\n    },\n    reset: function(options, done){\n      // Clean up... this runs before and after 'run'\n      done();\n    }\n  },\n  'new-method': { ... }\n});\n```\n\n`_value` Setting | Description\n---------|---------------\n`rate`   | (Default) Operations per second (based on `count`)\n`time`   | Total time elapsed (milliseconds)\n`resetTime` | Time elapsed during clean-up\n*Custom* | Any other value except `name` or `count`\n\n* Custom time points may also be graphed. Mark a partial completion by passing a string argument to `done()` (e.g. `done('server')`), then use the same string for `_value`. `done()` with no arguments must still be called later.\n* `run` and `reset` functions run asynchronously until `done()` is called.\n* `count`, `sampleSize`, and `methods` are reserved option names.\n\nTo run only a subset of methods in a benchmark, pass a `methods` option (an array of strings naming the methods to test) when configuring the benchmark in the browser.\n\n## License\n\nMIT\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnumtel%2Fmeteor-benchmark-packages","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fnumtel%2Fmeteor-benchmark-packages","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnumtel%2Fmeteor-benchmark-packages/lists"}