Benchmarking script for comparing performance of Magento cache backends.
https://github.com/colinmollenhour/magento-cache-benchmark
- Host: GitHub
- URL: https://github.com/colinmollenhour/magento-cache-benchmark
- Owner: colinmollenhour
- Created: 2011-09-15T00:00:29.000Z (about 13 years ago)
- Default Branch: master
- Last Pushed: 2012-11-09T03:32:15.000Z (almost 12 years ago)
- Last Synced: 2024-10-04T11:32:22.729Z (about 1 month ago)
- Language: PHP
- Homepage: http://colin.mollenhour.com
- Size: 119 KB
- Stars: 78
- Watchers: 15
- Forks: 16
- Open Issues: 1
- Metadata Files:
  - Readme: README.md
# Magento cache backend benchmark
This script was forked from the benchmark.php in [Vinai's Symlink-Cache](http://github.com/Vinai/Symlink-Cache) module. Thanks, Vinai!

## INSTALLATION
If you've never used modman before, download [modman](http://code.google.com/p/module-manager/)
and place it in your PATH, then from the root of your Magento installation run:

    modman init

Then:

    modman clone git://github.com/colinmollenhour/magento-cache-benchmark.git
## USAGE
    php shell/cache-benchmark.php init
    bash var/cachebench/default/run.sh

## FEATURES
* Flexible dataset generation via options to init command
* Repeatable tests. Dataset is written to static files so the same test can be repeated, even with different backends.
* Test datasets can easily be zipped up and copied to different environments or shared.
* Can easily test multiple pre-generated datasets.
* Supports multi-process benchmarking, each process with a different set of random operations.
* Cache record data size, number of tags, expiration, popularity and volatility are all randomized (see the sketch after this list).
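
To make the randomization concrete, here is a minimal, hypothetical PHP sketch of how a single dataset record could be generated from the documented `init` options. This is illustration only; the repository's actual generator in shell/cache-benchmark.php may differ.

    <?php
    // Hypothetical sketch only -- not the repository's actual generator.
    // Shows how one cache record could be randomized from the documented
    // 'init' options: --tags, --min-tags, --max-tags, --min-rec-size,
    // --max-rec-size and --seed.
    mt_srand(42);                 // --seed: a fixed seed makes the dataset repeatable

    $numTags    = 2000;           // --tags
    $minTags    = 0;              // --min-tags
    $maxTags    = 15;             // --max-tags
    $minRecSize = 1;              // --min-rec-size
    $maxRecSize = 1024;           // --max-rec-size

    // One cache record: a random-sized payload plus a random set of tags.
    $record = array(
        'id'   => 'KEY_' . mt_rand(),
        'data' => str_repeat('x', mt_rand($minRecSize, $maxRecSize)),
        'tags' => array(),
    );
    for ($i = 0, $n = mt_rand($minTags, $maxTags); $i < $n; $i++) {
        $record['tags'][] = 'TAG_' . mt_rand(1, $numTags);
    }

Because the seed and the generated records are written out as static files, every backend under test replays exactly the same workload.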
## EXAMPLE RUN

    Cache Backend: Zend_Cache_Backend_Redis
    Loading 'default' test data...
    Loaded 10000 cache records in 16.9125 seconds. Data size is 5009.0K
    Analyzing current cache contents...
    Counted 10021 cache IDs and 2005 cache tags in 0.2560 seconds
    Benchmarking getIdsMatchingTags...
    Average: 0.00039 seconds (36.82 ids per tag)
    Benchmarking 4 concurrent clients, each with 100000 operations.
    4 concurrent clients completed in 64 seconds

             |   reads |  writes |  cleans
    --------------------------------------
    Client 2 | 1680.80 |  313.59 |  380.58
    Client 1 | 1681.22 |  318.17 |  292.41
    Client 3 | 1664.77 |  316.60 |  311.62
    Client 0 | 1650.93 |  259.28 |  361.04
    --------------------------------------
    ops/sec  | 6677.72 | 1207.64 | 1345.65
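
For context, the "Benchmarking getIdsMatchingTags" step above times tag-to-ID lookups through the standard Zend Framework 1 cache interface. A minimal standalone sketch of that call, assuming a stock ZF1 install and using the File backend purely for illustration (the run above used a Redis backend):

    <?php
    // Minimal sketch, assuming Zend Framework 1 is on the include path.
    require_once 'Zend/Cache.php';

    $cache = Zend_Cache::factory('Core', 'File',
        array('automatic_serialization' => true),   // frontend options
        array('cache_dir' => '/tmp')                // backend options
    );

    // Save a record under two tags, then look up cache IDs by tag --
    // the operation this benchmark measures per tag.
    $cache->save('payload', 'KEY_1', array('TAG_A', 'TAG_B'));
    $ids = $cache->getIdsMatchingTags(array('TAG_A'));  // array('KEY_1')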
## CLI HELP

    Usage: php -f shell/cache-benchmark.php [command] [options]

    Commands:
      init [options]    Initialize a new dataset.
      load --name       Load an existing dataset.
      clean             Flush the cache backend.
      tags              Benchmark the getIdsMatchingTags method.
      ops [options]     Execute a pre-generated set of operations on the existing cache.

    'init' options:
      --name            A unique name for this dataset (defaults to "default")
      --keys            Number of cache keys (defaults to 10000)
      --tags            Number of cache tags (defaults to 2000)
      --min-tags        Minimum number of tags to use for each record (defaults to 0)
      --max-tags        Maximum number of tags to use for each record (defaults to 15)
      --min-rec-size    Smallest size for a record (defaults to 1)
      --max-rec-size    Largest size for a record (defaults to 1024)
      --clients         Number of clients for multi-process testing (defaults to 4)
      --seed            Random number generator seed (defaults to random)

    'ops' options:
      --name            The dataset to use (from the --name option of the init command)
      --client          Client number (0 to n-1, where n is the --clients option from the init command)
      -q|--quiet        Be less verbose.
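
Putting the options together, a hypothetical larger benchmark might look like the following. The run.sh path is an assumption here, inferred from the dataset name following the pattern of the default example under USAGE:

    php shell/cache-benchmark.php init --name big --keys 50000 --tags 5000 --clients 8 --seed 1
    bash var/cachebench/big/run.sh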