{"id":13424705,"url":"https://github.com/long2ice/fastapi-cache","last_synced_at":"2025-04-23T23:05:07.151Z","repository":{"id":40604484,"uuid":"290160721","full_name":"long2ice/fastapi-cache","owner":"long2ice","description":"fastapi-cache is a tool to cache fastapi response and function result, with backends support redis and memcached.","archived":false,"fork":false,"pushed_at":"2025-04-18T08:47:35.000Z","size":930,"stargazers_count":1545,"open_issues_count":98,"forks_count":181,"subscribers_count":7,"default_branch":"main","last_synced_at":"2025-04-23T23:04:49.687Z","etag":null,"topics":["cache","fastapi","memcached","redis"],"latest_commit_sha":null,"homepage":"https://github.com/long2ice/fastapi-cache","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/long2ice.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null},"funding":{"custom":["https://sponsor.long2ice.io"]}},"created_at":"2020-08-25T08:37:24.000Z","updated_at":"2025-04-23T14:03:47.000Z","dependencies_parsed_at":"2023-10-12T19:09:45.437Z","dependency_job_id":"81b5ac56-796c-42ca-b584-7f23bcfdf849","html_url":"https://github.com/long2ice/fastapi-cache","commit_stats":{"total_commits":253,"total_committers":30,"mean_commits":8.433333333333334,"dds":0.6679841897233201,"last_synced_commit":"567934300b4961a0b7d87e902ade5680b79bdb03"},"previous_names":[],"tags_count":12,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/long2ice%2Ffastapi-cache","tags_url":"https://repos.ecosys
te.ms/api/v1/hosts/GitHub/repositories/long2ice%2Ffastapi-cache/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/long2ice%2Ffastapi-cache/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/long2ice%2Ffastapi-cache/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/long2ice","download_url":"https://codeload.github.com/long2ice/fastapi-cache/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250528732,"owners_count":21445516,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cache","fastapi","memcached","redis"],"created_at":"2024-07-31T00:00:58.114Z","updated_at":"2025-04-23T23:05:07.123Z","avatar_url":"https://github.com/long2ice.png","language":"Python","readme":"# fastapi-cache\n\n[![pypi](https://img.shields.io/pypi/v/fastapi-cache2.svg?style=flat)](https://pypi.org/p/fastapi-cache2)\n[![license](https://img.shields.io/github/license/long2ice/fastapi-cache)](https://github.com/long2ice/fastapi-cache/blob/main/LICENSE)\n[![CI/CD](https://github.com/long2ice/fastapi-cache/actions/workflows/ci-cd.yml/badge.svg)](https://github.com/long2ice/fastapi-cache/actions/workflows/ci-cd.yml)\n\n## Introduction\n\n`fastapi-cache` is a tool to cache FastAPI endpoint and function results, with\nbackends supporting Redis, Memcached, and Amazon DynamoDB.\n\n## Features\n\n- Supports `redis`, `memcache`, `dynamodb`, and `in-memory` backends.\n- Easy integration with [FastAPI](https://fastapi.tiangolo.com/).\n- Support for HTTP cache headers 
like `ETag` and `Cache-Control`, as well as conditional `If-None-Match` requests.\n\n## Requirements\n\n- FastAPI\n- `redis` when using `RedisBackend`.\n- `memcache` when using `MemcacheBackend`.\n- `aiobotocore` when using `DynamoBackend`.\n\n## Install\n\n```shell\n\u003e pip install fastapi-cache2\n```\n\nor\n\n```shell\n\u003e pip install \"fastapi-cache2[redis]\"\n```\n\nor\n\n```shell\n\u003e pip install \"fastapi-cache2[memcache]\"\n```\n\nor\n\n```shell\n\u003e pip install \"fastapi-cache2[dynamodb]\"\n```\n\n## Usage\n\n### Quick Start\n\n```python\nfrom collections.abc import AsyncIterator\nfrom contextlib import asynccontextmanager\n\nfrom fastapi import FastAPI\nfrom starlette.requests import Request\nfrom starlette.responses import Response\n\nfrom fastapi_cache import FastAPICache\nfrom fastapi_cache.backends.redis import RedisBackend\nfrom fastapi_cache.decorator import cache\n\nfrom redis import asyncio as aioredis\n\n\n@asynccontextmanager\nasync def lifespan(_: FastAPI) -\u003e AsyncIterator[None]:\n    redis = aioredis.from_url(\"redis://localhost\")\n    FastAPICache.init(RedisBackend(redis), prefix=\"fastapi-cache\")\n    yield\n\n\napp = FastAPI(lifespan=lifespan)\n\n\n@cache()\nasync def get_cache():\n    return 1\n\n\n@app.get(\"/\")\n@cache(expire=60)\nasync def index():\n    return dict(hello=\"world\")\n```\n\n### Initialization\n\nFirst you must call `FastAPICache.init` during FastAPI startup; this is where you set global configuration.\n\n### Use the `@cache` decorator\n\nIf you want to cache a FastAPI response transparently, you can use the `@cache`\ndecorator between the router decorator and the view function.\n\nParameter | type | default | description\n------------ | ----| --------- | --------\n`expire` | `int` |  | sets the caching time in seconds\n`namespace` | `str` | `\"\"` | namespace to use to store certain cache items\n`coder` | `Coder` | `JsonCoder` | which coder to use, e.g. 
`JsonCoder`\n`key_builder` | `KeyBuilder` callable | `default_key_builder` | which key builder to use\n`injected_dependency_namespace` | `str` | `__fastapi_cache` | prefix for injected dependency keywords.\n`cache_status_header` | `str` | `X-FastAPI-Cache` | Name for the header on the response indicating if the request was served from cache; either `HIT` or `MISS`.\n\nYou can also use the `@cache` decorator on regular functions to cache their result.\n\n### Injected Request and Response dependencies\n\nThe `cache` decorator injects dependencies for the `Request` and `Response`\nobjects, so that it can add cache control headers to the outgoing response, and\nreturn a 304 Not Modified response when the incoming request has a matching\n`If-None-Match` header. This only happens if the decorated endpoint doesn't already\nlist these dependencies.\n\nThe keyword arguments for these extra dependencies are named\n`__fastapi_cache_request` and `__fastapi_cache_response` to minimize collisions.\nUse the `injected_dependency_namespace` argument to `@cache` to change the\nprefix used if those names would clash anyway.\n\n\n### Supported data types\n\nWhen using the (default) `JsonCoder`, the cache can store any data type that FastAPI can convert to JSON, including Pydantic models and dataclasses,\n_provided_ that your endpoint has a correct return type annotation. An\nannotation is not needed if the return type is a standard JSON-supported Python\ntype such as a dictionary or a list.\n\nE.g. for an endpoint that returns a Pydantic model named `SomeModel`, the return annotation is used to ensure that the cached result is converted back to the correct class:\n\n```python\nfrom .models import SomeModel, create_some_model\n\n@app.get(\"/foo\")\n@cache(expire=60)\nasync def foo() -\u003e SomeModel:\n    return create_some_model()\n```\n\nIt is not sufficient to configure a response model in the route decorator; the cache needs to know what the method itself returns. If no return type annotation is given, the primitive JSON type is returned instead.\n\nFor broader type support, use `fastapi_cache.coder.PickleCoder` or implement a custom coder (see below).\n\n### Custom coder\n\n`JsonCoder` is used by default. You can write a custom coder to encode and decode cache results; it just needs to\ninherit from `fastapi_cache.coder.Coder`.\n\n```python\nfrom typing import Any\nimport orjson\nfrom fastapi.encoders import jsonable_encoder\nfrom fastapi_cache import Coder\n\nclass ORJsonCoder(Coder):\n    @classmethod\n    def encode(cls, value: Any) -\u003e bytes:\n        return orjson.dumps(\n            value,\n            default=jsonable_encoder,\n            option=orjson.OPT_NON_STR_KEYS | orjson.OPT_SERIALIZE_NUMPY,\n        )\n\n    @classmethod\n    def decode(cls, value: bytes) -\u003e Any:\n        return orjson.loads(value)\n\n\n@app.get(\"/\")\n@cache(expire=60, coder=ORJsonCoder)\nasync def index():\n    return dict(hello=\"world\")\n```\n\n### Custom key builder\n\nBy default the `default_key_builder` builtin key builder is used; this creates a\ncache key from the function module and name, and the positional and keyword\narguments converted to their `repr()` representations, encoded as an MD5 hash.\nYou can provide your own by passing a key builder to `@cache()`, or to\n`FastAPICache.init()` to apply globally.\n\nFor example, if you wanted to use the request method, URL and query string as a cache key instead of the function identifier, you could use:\n\n```python\nfrom starlette.requests import Request\nfrom starlette.responses import Response\n\n\ndef request_key_builder(\n    func,\n    namespace: str = \"\",\n    *,\n    request: Request = None,\n    response: Response = None,\n    **kwargs,\n):\n    return \":\".join([\n        namespace,\n        request.method.lower(),\n        request.url.path,\n        repr(sorted(request.query_params.items()))\n    ])\n\n\n@app.get(\"/\")\n@cache(expire=60, key_builder=request_key_builder)\nasync def index():\n    return dict(hello=\"world\")\n```\n\n## Backend 
notes\n\n### InMemoryBackend\n\nThe `InMemoryBackend` stores cache data in memory and only deletes when an\nexpired key is accessed. This means that if you don't access a function after\ndata has been cached, the data will not be removed automatically.\n\n### RedisBackend\n\nWhen using the Redis backend, please make sure you pass in a redis client that does [_not_ decode responses][redis-decode] (`decode_responses` **must** be `False`, which is the default). Cached data is stored as `bytes` (binary); decoding it in the Redis client would break caching.\n\n[redis-decode]: https://redis-py.readthedocs.io/en/latest/examples/connection_examples.html#by-default-Redis-return-binary-responses,-to-decode-them-use-decode_responses=True\n\n## Tests and coverage\n\n```shell\ncoverage run -m pytest\ncoverage html\nxdg-open htmlcov/index.html\n```\n\n## License\n\nThis project is licensed under the [Apache-2.0](https://github.com/long2ice/fastapi-cache/blob/main/LICENSE) License.\n","funding_links":["https://sponsor.long2ice.io"],"categories":["Third-Party Extensions","Python","Web","Caching"],"sub_categories":["Utils"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flong2ice%2Ffastapi-cache","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Flong2ice%2Ffastapi-cache","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flong2ice%2Ffastapi-cache/lists"}