{"id":18621268,"url":"https://github.com/yiling-j/cacheme","last_synced_at":"2025-10-14T09:32:13.300Z","repository":{"id":57416813,"uuid":"225180485","full_name":"Yiling-J/cacheme","owner":"Yiling-J","description":"Asyncio cache framework for Python","archived":false,"fork":false,"pushed_at":"2023-06-02T20:11:40.000Z","size":474,"stargazers_count":45,"open_issues_count":3,"forks_count":0,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-07-27T07:46:55.095Z","etag":null,"topics":["asyncio","cache","framework","memory","python","redis"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"bsd-3-clause","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Yiling-J.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2019-12-01T15:06:38.000Z","updated_at":"2025-03-06T05:59:21.000Z","dependencies_parsed_at":"2025-04-11T02:41:53.906Z","dependency_job_id":"b265cf17-cd29-41db-a439-16690e42389b","html_url":"https://github.com/Yiling-J/cacheme","commit_stats":null,"previous_names":[],"tags_count":8,"template":false,"template_full_name":null,"purl":"pkg:github/Yiling-J/cacheme","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Yiling-J%2Fcacheme","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Yiling-J%2Fcacheme/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Yiling-J%2Fcacheme/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Yiling-J%2Fcacheme/manifests","owner_url":"https://repos.ecosyst
e.ms/api/v1/hosts/GitHub/owners/Yiling-J","download_url":"https://codeload.github.com/Yiling-J/cacheme/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Yiling-J%2Fcacheme/sbom","scorecard":{"id":154842,"data":{"date":"2025-08-11","repo":{"name":"github.com/Yiling-J/cacheme","commit":"f402c45267ad107a647395a74e67e53760c13755"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":3.2,"checks":[{"name":"Maintained","score":0,"reason":"0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/benchmark.yml:1","Warn: no topLevel permission defined: .github/workflows/benchmark_template.yml:1","Warn: no topLevel permission defined: .github/workflows/codeql-analysis.yml:1","Warn: no topLevel permission defined: .github/workflows/release.yml:1","Warn: no topLevel permission defined: .github/workflows/test.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least 
privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Code-Review","score":0,"reason":"Found 0/29 approved changesets -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Pinned-Dependencies","score":0,"reason":"dependency not pinned by hash detected -- score normalized to 0","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/benchmark_template.yml:63: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/benchmark_template.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/benchmark_template.yml:64: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/benchmark_template.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/benchmark_template.yml:68: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/benchmark_template.yml/master?enable=pin","Warn: GitHub-owned 
GitHubAction not pinned by hash: .github/workflows/benchmark_template.yml:74: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/benchmark_template.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/benchmark_template.yml:87: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/benchmark_template.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/codeql-analysis.yml:33: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/codeql-analysis.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/codeql-analysis.yml:46: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/codeql-analysis.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/codeql-analysis.yml:57: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/codeql-analysis.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/codeql-analysis.yml:71: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/codeql-analysis.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/release.yml:17: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/release.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/release.yml:18: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/release.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/release.yml:22: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/release.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: 
.github/workflows/release.yml:28: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/release.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:68: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/test.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:69: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/test.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/test.yml:73: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/test.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:79: update your workflow using https://app.stepsecurity.io/secureworkflow/Yiling-J/cacheme/test.yml/master?enable=pin","Info:   0 out of  13 GitHub-owned GitHubAction dependencies pinned","Info:   0 out of   4 third-party GitHubAction dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security 
policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"License","score":10,"reason":"license file detected","details":["Info: project has a license file: LICENSE:0","Info: FSF or OSI recognized license: BSD 3-Clause \"New\" or \"Revised\" License: LICENSE:0"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":-1,"reason":"internal error: error during branchesHandler.setup: internal error: githubv4.Query: Resource not accessible by integration","details":null,"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"Vulnerabilities","score":0,"reason":"15 existing vulnerabilities detected","details":["Warn: Project is vulnerable to: GHSA-3ww4-gg4f-jr7f","Warn: Project is vulnerable to: GHSA-5cpq-8wj7-hf2v","Warn: Project is vulnerable to: PYSEC-2024-225 / GHSA-6vqw-3v5j-54x4","Warn: Project is vulnerable to: GHSA-9v9h-cgj8-h64p","Warn: Project is vulnerable to: GHSA-h4gh-qq45-vh27","Warn: Project is vulnerable to: PYSEC-2023-254 / 
GHSA-jfhm-5ghh-2f97","Warn: Project is vulnerable to: GHSA-jm77-qphf-c4w8","Warn: Project is vulnerable to: GHSA-v8gr-m533-ghj9","Warn: Project is vulnerable to: GHSA-3rq5-2g8h-59hc","Warn: Project is vulnerable to: GHSA-mr82-8j83-vxmv","Warn: Project is vulnerable to: GHSA-m87m-mmvp-v9qm","Warn: Project is vulnerable to: GHSA-v9hf-5j83-6xpp","Warn: Project is vulnerable to: PYSEC-2023-45 / GHSA-24wv-mv5m-xv4h","Warn: Project is vulnerable to: PYSEC-2023-46 / GHSA-8fww-64cx-x8p5","Warn: Project is vulnerable to: GHSA-jfmj-5v4g-7637"],"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}},{"name":"SAST","score":7,"reason":"SAST tool detected but not run on all commits","details":["Info: SAST configuration detected: CodeQL","Warn: 0 commits out of 3 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code 
analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}]},"last_synced_at":"2025-08-16T11:29:02.210Z","repository_id":57416813,"created_at":"2025-08-16T11:29:02.210Z","updated_at":"2025-08-16T11:29:02.210Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279018630,"owners_count":26086404,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-14T02:00:06.444Z","response_time":60,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["asyncio","cache","framework","memory","python","redis"],"created_at":"2024-11-07T04:10:06.271Z","updated_at":"2025-10-14T09:32:13.284Z","avatar_url":"https://github.com/Yiling-J.png","language":"Python","readme":"# Cacheme\n\nAsyncio cache framework with multiple cache storages.\n\n- **Organize cache better:** Cache is configured per node, so you can apply different strategies to different nodes.\n- **Multiple cache storages:** in-memory/redis/mongodb/postgres..., chained storages are also supported.\n- **Multiple serializers:** Pickle/JSON/MsgPack serializers.\n- **Thundering herd protection:** Simultaneous requests for the same key are blocked by an asyncio Event and load from the source only once.\n- **Cache stats API:** Stats of each node are collected automatically.\n- **Performance:** See the Benchmarks section.\n\nRelated projects:\n- High performance in-memory cache: 
https://github.com/Yiling-J/theine\n\n## Table of Contents\n\n- [Installation](#installation)\n- [Add Node](#add-node)\n- [Register Storage](#register-storage)\n- [Cacheme API](#cacheme-api)\n- [Cache Node](#cache-node)\n    + [Key](#key)\n    + [Meta Class](#meta-class)\n    + [Serializers](#serializers)\n    + [DoorKeeper](#doorkeeper)\n- [Cache Storage](#cache-storage)\n    + [Local Storage](#local-storage)\n    + [Redis Storage](#redis-storage)\n    + [MongoDB Storage](#mongodb-storage)\n    + [Sqlite Storage](#sqlite-storage)\n    + [PostgreSQL Storage](#postgresql-storage)\n    + [MySQL Storage](#mysql-storage)\n- [How Thundering Herd Protection Works](#how-thundering-herd-protection-works)\n- [Benchmarks](#benchmarks)\n    + [continuous benchmark](#continuous-benchmark)\n    + [200k concurrent requests](#200k-concurrent-requests)\n    + [20k concurrent batch requests](#20k-concurrent-batch-requests)\n\n## Requirements\nPython 3.7+\n\n## Installation\n\n```\npip install cacheme\n```\n\nMultiple storages are supported by drivers. You can install the required drivers with:\n```\npip install cacheme[redis]\npip install cacheme[aiomysql]\npip install cacheme[motor]\npip install cacheme[asyncpg]\n```\n\n## Add Node\nThe node is the core concept of Cacheme. Each node has its own key function, load function, and storage options. Stats of each node are collected independently. You can place all node definitions into one package/module, so everyone knows exactly what is cached and how. All Cacheme APIs are based on nodes.\n\nEach node contains:\n- Key attributes and a `key` method, which are used to generate the cache key. Here `UserInfoNode` is a dataclass, so the `__init__` method is generated automatically.\n- An async `load` method, which is called to load data from the data source on a cache miss. This method can be omitted if you only use the `Memoize` decorator.\n- A `Meta` class holding the node's cache configuration. 
See [Cache Node](#cache-node).\n\n```python\nimport cacheme\nfrom dataclasses import dataclass\nfrom typing import Dict\nfrom cacheme.serializer import MsgPackSerializer\n\n@dataclass\nclass UserInfoNode(cacheme.Node):\n    user_id: int\n\n    def key(self) -\u003e str:\n        return f\"user:{self.user_id}:info\"\n\n    async def load(self) -\u003e Dict:\n        user = get_user_from_db(self.user_id)\n        return serialize(user)\n\n    class Meta(cacheme.Node.Meta):\n        version = \"v1\"\n        caches = [cacheme.Cache(storage=\"my-redis\", ttl=None)]\n        serializer = MsgPackSerializer()\n```\nThis simple example uses a cache storage called \"my-redis\", which is registered in the next step. We also use `MsgPackSerializer` here to dump and load data from Redis. See [Cache Node](#cache-node) for more details.\n\n## Register Storage\n\nRegister a Redis storage called \"my-redis\", which you can then reference in node metadata. `register_storage` is asynchronous and will try to establish a connection to the cache store.\nSee [Cache Storage](#cache-storage) for more details.\n\n```python\nimport cacheme\n\nawait cacheme.register_storage(\"my-redis\", cacheme.Storage(url=\"redis://localhost:6379\"))\n```\n\n## Cacheme API\n\n`get`: get data from a single node.\n```python\nuser = await cacheme.get(UserInfoNode(user_id=1))\n```\n\n`get_all`: get data from multiple nodes of the same node type.\n```python\nusers = await cacheme.get_all([UserInfoNode(user_id=1), UserInfoNode(user_id=2)])\n```\n\n`invalidate`: invalidate a node, removing its data from the cache.\n```python\nawait cacheme.invalidate(UserInfoNode(user_id=1))\n```\n\n`refresh`: reload node data using the `load` method.\n```python\nawait cacheme.refresh(UserInfoNode(user_id=1))\n```\n\n`Memoize`: memoize a function with this decorator.\n\nDecorate your function with the `cacheme.Memoize` decorator and a cache node. 
Cacheme will load data using the decorated function and ignore the `load` method.\nBecause your function may accept a variable number of args/kwargs, one more step is needed to map those args/kwargs to a node. The decorated map function should have the same input signature as the memoized function and return a cache node.\n\n```python\n@cacheme.Memoize(UserInfoNode)\nasync def get_user_info(user_id: int) -\u003e Dict:\n    return {}\n\n# function name is not important, so just use _ here\n@get_user_info.to_node\ndef _(user_id: int) -\u003e UserInfoNode:\n    return UserInfoNode(user_id=user_id)\n```\n\n`nodes`: list all nodes.\n```python\nnodes = cacheme.nodes()\n```\n\n`stats`: get node stats.\n```python\nmetrics = cacheme.stats(UserInfoNode)\n\nmetrics.request_count() # total request count\nmetrics.hit_count() # total hit count\nmetrics.hit_rate() # hit_count/request_count\nmetrics.miss_count() # request_count - hit_count\nmetrics.miss_rate() # miss_count/request_count\nmetrics.load_success_count() # total load success count\nmetrics.load_failure_count() # total load fail count\nmetrics.load_failure_rate() # load_failure_count/load_count\nmetrics.load_count() # total load count\nmetrics.total_load_time() # total load time in nanoseconds\nmetrics.average_load_time() # total_load_time/load_count\n```\n\n`set_prefix`: set the prefix for all keys. The default prefix is `cacheme`. Changing the prefix invalidates all keys, because the prefix is part of the key.\n```python\ncacheme.set_prefix(\"mycache\")\n```\n\n\n## Cache Node\n\n#### Key\nThe generated cache key will be `{prefix}:{key()}:{Meta.version}`, so changing `version` invalidates all keys automatically.\n\n#### Meta Class\n- `version[str]`: Version of the node, used as the suffix of the cache key.\n- `caches[List[Cache]]`: Caches for the node. Each `Cache` has 2 attributes: `storage[str]` and `ttl[Optional[timedelta]]`. `storage` is the name you registered with `register_storage`, and `ttl` is how long this cache lives. 
Cacheme will try to get data from each cache from left to right. In most cases, use a single cache or a [local, remote] combination.\n- `serializer[Optional[Serializer]]`: Serializer used to dump/load data. If the storage type is `local`, the serializer is ignored. See [Serializers](#serializers).\n- `doorkeeper[Optional[DoorKeeper]]`: See [DoorKeeper](#doorkeeper).\n\nA multiple-caches example. The local cache is not synchronized, so set a much shorter TTL than the Redis one; then stale data is less of a concern.\n\n```python\nimport cacheme\nfrom dataclasses import dataclass\nfrom datetime import timedelta\nfrom typing import Dict\nfrom cacheme.serializer import MsgPackSerializer\n\n@dataclass\nclass UserInfoNode(cacheme.Node):\n    user_id: int\n\n    def key(self) -\u003e str:\n        return f\"user:{self.user_id}:info\"\n\n    async def load(self) -\u003e Dict:\n        user = get_user_from_db(self.user_id)\n        return serialize(user)\n\n    class Meta(cacheme.Node.Meta):\n        version = \"v1\"\n        caches = [\n            cacheme.Cache(storage=\"local\", ttl=timedelta(seconds=30)),\n            cacheme.Cache(storage=\"my-redis\", ttl=timedelta(days=10))\n        ]\n        serializer = MsgPackSerializer()\n```\n\nCacheme also supports creating nodes dynamically; you can use this together with the `Memoize` decorator:\n\n```python\n@cacheme.Memoize(cacheme.build_node(\"TestNodeDynamic\", \"v1\", [cacheme.Cache(storage=\"local\", ttl=None)]))\nasync def fn(a: int) -\u003e int:\n    return 1\n\n\n@fn.to_node\ndef _(a: int) -\u003e cacheme.DynamicNode:\n    return cacheme.DynamicNode(key=f\"bar:{a}\")\n```\nHere we use `DynamicNode`, which supports only one param: `key`.\n\n#### Serializers\nCacheme provides several built-in serializers, and you can also write your own.\n\n- `PickleSerializer`: All Python objects.\n- `JSONSerializer`: Uses `pydantic_encoder` and `json`; supports Python primitive types, dataclasses, and pydantic models. 
See [pydantic types](https://docs.pydantic.dev/usage/types/).\n- `MsgPackSerializer`: Uses `pydantic_encoder` and `msgpack`; supports Python primitive types, dataclasses, and pydantic models. See [pydantic types](https://docs.pydantic.dev/usage/types/).\n\nSerializers with compression (zlib, level 3):\n\n- `CompressedPickleSerializer`\n- `CompressedJSONSerializer`\n- `CompressedMsgPackSerializer`\n\n#### DoorKeeper\nIdea from the [TinyLfu paper](https://arxiv.org/pdf/1512.00727.pdf).\n\n*The Doorkeeper is a regular Bloom filter placed in front of the cache. Upon\nitem arrival, we first check if the item is contained in the Doorkeeper. If it is not contained in the\nDoorkeeper (as is expected with first timers and tail items), the item is inserted to the Doorkeeper and\notherwise, it is inserted to the cache.*\n\n```python\nimport cacheme\nfrom dataclasses import dataclass\nfrom cacheme import BloomFilter\n\n@dataclass\nclass UserInfoNode(cacheme.Node):\n\n    class Meta(cacheme.Node.Meta):\n        # size 100000, false positive probability 0.01\n        doorkeeper = BloomFilter(100000, 0.01)\n```\nThe BloomFilter is cleared automatically when the request count reaches its size.\n\n\n## Cache Storage\n\n#### Local Storage\nLocal storage uses the state-of-the-art library **Theine** to store data. If your use case is simple, also consider using [Theine](https://github.com/Yiling-J/theine) directly, which has the best performance.\n\n```python\n# lru policy\nStorage(url=\"local://lru\", size=10000)\n\n# w-tinylfu policy\nStorage(url=\"local://tlfu\", size=10000)\n```\nParameters:\n\n- `url`: `local://{policy}`. 2 policies are currently supported:\n  - `lru`\n  - `tlfu`: W-TinyLfu policy\n\n- `size`: size of the storage. 
The policy is used to evict keys when the cache is full.\n\n#### Redis Storage\n```python\nStorage(url=\"redis://localhost:6379\")\n\n# cluster\nStorage(url=\"redis://localhost:6379\", cluster=True)\n```\nParameters:\n\n- `url`: redis connection url.\n- `cluster`: bool, whether to use cluster mode, default False.\n- `pool_size`: connection pool size, default 100.\n\n#### MongoDB Storage\nTo use the MongoDB storage, create the index first. See [mongo.js](cacheme/storages/scripts/mongo.js).\n```python\nStorage(url=\"mongodb://test:password@localhost:27017\", database=\"test\", collection=\"cache\")\n```\nParameters:\n\n- `url`: mongodb connection url.\n- `database`: mongodb database name.\n- `collection`: mongodb collection name.\n- `pool_size`: connection pool size, default 50.\n\n#### Sqlite Storage\nTo use the SQLite storage, create the table and index first. See [sqlite.sql](cacheme/storages/scripts/sqlite.sql).\n```python\nStorage(url=\"sqlite:///test\", table=\"cache\")\n```\nParameters:\n\n- `url`: sqlite connection url.\n- `table`: cache table name.\n- `pool_size`: connection pool size, default 50.\n\n#### PostgreSQL Storage\nTo use the PostgreSQL storage, create the table and index first. See [postgresql.sql](cacheme/storages/scripts/postgresql.sql).\n```python\nStorage(url=\"postgresql://username:password@127.0.0.1:5432/test\", table=\"cache\")\n```\nParameters:\n\n- `url`: postgres connection url.\n- `table`: cache table name.\n- `pool_size`: connection pool size, default 50.\n\n#### MySQL Storage\nTo use the MySQL storage, create the table and index first. See [mysql.sql](cacheme/storages/scripts/mysql.sql).\n```python\nStorage(url=\"mysql://username:password@localhost:3306/test\", table=\"cache\")\n```\nParameters:\n\n- `url`: mysql connection url.\n- `table`: cache table name.\n- `pool_size`: connection pool size, default 50.\n\n## How Thundering Herd Protection Works\n\nIf you are familiar with Go's [singleflight](https://pkg.go.dev/golang.org/x/sync/singleflight), you may already have an idea of how Cacheme works. 
Cacheme groups concurrent requests to the same resource (node) into a single flight using an asyncio Event, which will **load from the remote cache OR the data source only once**. That's why, in the Benchmarks section below, you will find Cacheme even reduces the total Redis GET command count under high concurrency.\n\n\n## Benchmarks\n\n### continuous benchmark\nhttps://github.com/Yiling-J/cacheme-benchmark\n\n### 200k concurrent requests\n\naiocache: https://github.com/aio-libs/aiocache\n\ncashews: https://github.com/Krukov/cashews\n\nsource code: https://github.com/Yiling-J/cacheme/blob/master/benchmarks/trace.py\n\nHow this benchmark runs:\n\n1. Initialize Cacheme/Aiocache/Cashews with a Redis backend, using a Redis blocking pool with pool size 100.\n2. Decorate Aiocache/Cashews/Cacheme with a function that accepts a number and sleeps 0.1s. This function also records how many times it is called.\n3. Register a Redis response callback, so we know how many times the GET command is called.\n4. Create 200k coroutines using a Zipf generator and put them in an async queue (around 50k-60k unique numbers).\n5. Run the coroutines in the queue with N concurrent workers.\n6. 
Collect results.\n\nIdentifiers:\n- Cacheme: Cacheme redis storage\n- Aiocache: Aiocache cached decorator\n- Cashews: Cashews cache decorator\n- Cacheme-2: Cacheme using cache chain [local, redis]\n- Aiocache-2: Aiocache cached_stampede decorator\n- Cashews-2: Cashews decorator with lock=True\n\nResults:\n- Time: How long it takes to finish the benchmark.\n- Redis GET: How many times the Redis GET command is called; use this to evaluate pressure on the remote cache server.\n- Load Hits: How many times the load function (which sleeps 0.1s) is called; use this to evaluate pressure on the load source (database or something else).\n\n#### 1k concurrency\n\n|            | Time  | Redis GET  | Load Hits |\n|------------|-------|------------|-----------|\n| Cacheme    | 25 s  | 166454     | 55579     |\n| Cacheme-2  | 20 s  | 90681      | 55632     |\n| Aiocache   | 46 s  | 200000     | 56367     |\n| Aiocache-2 | 63 s  | 256492     | 55417     |\n| Cashews    | 51 s  | 200000     | 56920     |\n| Cashews-2  | 134 s | 200000     | 55450     |\n\n\n#### 10k concurrency\n\n|            | Time  | Redis GET | Load Hits |\n|------------|-------|-----------|-----------|\n| Cacheme    | 25 s  | 123704    | 56736     |\n| Cacheme-2  | 19 s  | 83750     | 56635     |\n| Aiocache   | 67 s  | 200000    | 62568     |\n| Aiocache-2 | 113 s | 263195    | 55507     |\n| Cashews    | 68 s  | 200000    | 66036     |\n| Cashews-2  | 175 s | 200000    | 55709     |\n\n\n#### 100k concurrency\n\n|            | Time  | Redis GET | Load Hits |\n|------------|-------|-----------|-----------|\n| Cacheme    | 24 s  | 60990     | 56782     |\n| Cacheme-2  | 22 s  | 55762     | 55588     |\n| Aiocache   | 80 s  | 200000    | 125085    |\n| Aiocache-2 | 178 s | 326417    | 65598     |\n| Cashews    | 88 s  | 200000    | 87894     |\n| Cashews-2  | 236 s | 200000    | 55647     |\n\n### 20k concurrent batch requests\n\nsource code: https://github.com/Yiling-J/cacheme/blob/master/benchmarks/trace.py\n\nHow this benchmark 
runs:\n\n1. Initialize Cacheme with a Redis backend, using a Redis blocking pool with pool size 100.\n2. Decorate Cacheme with a function that accepts a number and sleeps 0.1s. This function also records how many times it is called.\n3. Register a Redis response callback, so we know how many times the MGET command is called.\n4. Create 20k `get_all` coroutines using a Zipf generator and put them in an async queue (around 50k-60k unique numbers). Each `get_all` request fetches 20 unique numbers in a batch, so 400k numbers in total.\n5. Run the coroutines in the queue with N concurrent workers.\n6. Collect results.\n\nResults:\n- Time: How long it takes to finish the benchmark.\n- Redis MGET: How many times the Redis MGET command is called; use this to evaluate pressure on the remote cache server.\n- Load Hits: How many times the load function (which sleeps 0.1s) is called; use this to evaluate pressure on the load source (database or something else).\n\n#### 1k concurrency\n\n|            | Time | Redis MGET | Load Hits |\n|------------|------|------------|-----------|\n| Cacheme    | 12 s | 9996       | 55902     |\n\n\n#### 10k concurrency\n\n|            | Time  | Redis MGET | Load Hits |\n|------------|-------|------------|-----------|\n| Cacheme    | 11 s  | 9908       | 42894     |\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fyiling-j%2Fcacheme","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fyiling-j%2Fcacheme","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fyiling-j%2Fcacheme/lists"}