{"id":15578200,"url":"https://github.com/groupon/node-cached","last_synced_at":"2025-04-09T16:10:40.298Z","repository":{"id":14265175,"uuid":"16972976","full_name":"groupon/node-cached","owner":"groupon","description":"A simple caching library for node.js, inspired by the Play cache API","archived":false,"fork":false,"pushed_at":"2023-01-08T00:50:34.000Z","size":397,"stargazers_count":94,"open_issues_count":13,"forks_count":14,"subscribers_count":9,"default_branch":"main","last_synced_at":"2024-04-14T12:24:37.681Z","etag":null,"topics":["cache","javascript","memcached","nodejs"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"bsd-3-clause","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/groupon.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2014-02-19T03:46:56.000Z","updated_at":"2023-12-02T03:53:56.000Z","dependencies_parsed_at":"2023-01-13T17:51:28.423Z","dependency_job_id":null,"html_url":"https://github.com/groupon/node-cached","commit_stats":null,"previous_names":[],"tags_count":23,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/groupon%2Fnode-cached","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/groupon%2Fnode-cached/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/groupon%2Fnode-cached/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/groupon%2Fnode-cached/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/groupon","download_url":"https://codeload.github.com/groupon/node-cached/tar.gz/refs/heads
/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248065284,"owners_count":21041872,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cache","javascript","memcached","nodejs"],"created_at":"2024-10-02T19:07:44.921Z","updated_at":"2025-04-09T16:10:40.269Z","avatar_url":"https://github.com/groupon.png","language":"JavaScript","funding_links":[],"categories":[],"sub_categories":[],"readme":"[![nlm-github](https://img.shields.io/badge/github-groupon%2Fnode--cached%2Fissues-F4D03F?logo=github\u0026logoColor=white)](https://github.com/groupon/node-cached/issues)\n![nlm-node](https://img.shields.io/badge/node-%3E%3D10.13.0-blue?logo=node.js\u0026logoColor=white)\n![nlm-version](https://img.shields.io/badge/version-6.1.0-blue?logo=version\u0026logoColor=white)\n[![Publish to NPM](https://github.com/groupon/node-cached/actions/workflows/npm-publish.yml/badge.svg?event=push)](https://github.com/groupon/node-cached/actions/workflows/npm-publish.yml)\n# cached\n\nA simple caching library, inspired by the [Play cache API](http://www.playframework.com/documentation/2.2.x/ScalaCache) \nand biased towards [showing stale data instead of dog piling](http://highscalability.com/strategy-break-memcache-dog-pile).\nThe interface only exposes very limited functionality, there's no multi-get or deletion of cached data.\nThe library is designed to support different caching backends, though right now only memcached is implemented.\n\nIt supports both promise- and callback-based usage.\n\n## Install\n\n`npm install --save cached`\n\n## Usage\n\nMore 
detailed API docs are in the next section.\n\n### Getting and setting\n\n```js\nconst cached = require('cached');\n\nconst kittens = cached('kittens');\n\nasync function cacheKittens() {\n\n  // Set a key using a plain value\n  await kittens.set('my.key', 'Hello World');\n  \n  // Set a key using a lazily created promise\n  await kittens.set('my.key', () =\u003e {\n    return kittens.get('other.key');\n  });\n  \n  // Set a key using a callback-style function\n  await kittens.set('my.key', cached.deferred(done =\u003e {\n    done(null, 'Hello World');\n  }));\n  \n  const data = await kittens.getOrElse('my.key', () =\u003e {\n    // This will store \"Hello World\" for key \"my.key\" if\n    // \"my.key\" was not found\n    return 'Hello World';\n  });\n  \n  // Handle it the promise way\n  let res;\n  try {\n    res = await kittens.get('my.key');\n  } catch (e) {\n    /* ... */\n  }\n}\n```\n\n## Supported backends\n\n### Memory\n\nStores all the data in an in-memory object. This is the default backend.\n\n*__Caveat:__ `get()` will return a reference to the stored value. 
Mutating the returned value will affect the\nvalue in the cache.*\n\n### Memcached\n\nA thin wrapper around [memcached-elasticache](https://github.com/jkehres/memcached-elasticache).\nYou can either provide a readily configured client, or a combination of hosts and additional options.\nWithout any additional options it will default to a local memcached on port `11211`.\n\n#### Custom client instance\n\n```js\nconst Memcached = require('memcached-elasticache');\n\ncached('myStuff', { backend: {\n  type: 'memcached',\n  client: new Memcached('192.168.0.102:11212', { poolSize: 15 }),\n}});\n```\n\n#### Let `cached` create the instance\n\nThis will create the same cache as above.\n\n```js\ncached('myStuff', { backend: {\n  type: 'memcached',\n  hosts: '192.168.0.102:11212',\n  poolSize: 15,\n}});\n```\n\n#### Example: memory backend\n\n```js\ncached('myStuff', { backend: {\n  type: 'memory',\n}});\n```\n\n## API\n\n### cached(name: string, options) -\u003e Cache\n\nCreates a new named cache or returns a previously initialized cache.\n\n* **name:** (required) A meaningful name for what is in the cache. This will also be used as a key prefix: if the \n  name is `\"cars\"`, all keys will be prefixed with `\"cars:\"`.\n* **options:** (optional)\n  * **backend:** An object that has at least a `type` property. If no backend is configured, the cache will run in \n    \"noop\" mode, not caching anything. All other properties are forwarded to the backend; see \n    [using different backends](#supported-backends) for which backend types exist and what options they support.\n  * **defaults:** Defaults to apply for all cache operations. See `Cache.setDefaults`.\n\n### cached.createCache(options) -\u003e Cache\n\nThis allows you to circumvent the global named caches. 
The options are the same as above, except that `name` is also part \nof the `options` object when using this function.\n\n### cached.dropNamedCache(name: string) -\u003e cached\n\nDrop the given named cache.\n\n### cached.dropNamedCaches() -\u003e cached\n\nDrop all named caches.\n\n### cached.deferred(fn) -\u003e () -\u003e Promise\n\nConvert a node-style function that takes a callback as its first parameter into a parameterless function that \ngenerates a promise. In other words: this is what you'd want to wrap your node-style functions in when using them \nas value arguments to `set` or `getOrElse`.\n\n**Example:**\n```js\nconst http = require('http');\n\nconst cache = cached('myStuff');\nconst f = cached.deferred(cb =\u003e {\n  const req = http.get(myUrl, res =\u003e {\n    cb(null, res.statusCode);\n  });\n  req.once('error', cb);\n});\n\n// f can now be called and the return value will be a promise\nf().then(function(statusCode) { console.log(statusCode); });\n\n// More importantly it can be passed into cache.set\nawait cache.set('someKey', f);\n```\n\n### Cache.setDefaults(defaults) -\u003e Cache.defaults\n\nExtends the current defaults with the provided defaults.\nThe three most important ones are `expire`, `freshFor`, and `timeout`:\n\n* `expire` is the time in seconds after which a value should be deleted from the cache \n  (or whatever expiring natively means for the backend). Usually you'd want this to be `0` (never expire).\n* `freshFor` is the time in seconds after which a value should be replaced. Replacing the value is done in \n  the background and while the new value is generated (e.g. data is fetched from some service) the stale \n  value is returned. 
Think of `freshFor` as a smarter `expire`.\n* `timeout` is the maximum time in milliseconds to wait for cache operations to complete.\n  Configuring a timeout ensures that all `get`, `set`, and `unset` operations fail fast.\n  Otherwise, there will be situations where one of the cache hosts goes down and reads hang for minutes while \n  the memcached client retries establishing a connection.\n  It's **highly** recommended to set a timeout.\n  If `timeout` is left `undefined`, no timeout will be set, and the operations will only fail once the \n  underlying client, e.g. [`memcached`](https://github.com/3rd-Eden/memcached), has given up.\n  \n### Cache.get(key) -\u003e Promise\\\u003cvalue\\\u003e\n\nCache retrieve operation. `key` has to be a string.\nCache misses are generally treated the same as retrieving `null`; errors should only be caused by transport \nerrors and connection problems.\nIf you want to cache `null`/`undefined` (e.g. 404 responses), you may want to wrap it or choose a different \nvalue, like `false`, to represent this condition.\n\n**Example:**\n```js\nawait cache.get('foo');\n```\n\n### Cache.getOrElse(key, value, opts) -\u003e Promise\\\u003cvalue\\\u003e\n\nThis is the function you'd want to use most of the time.\nIt takes the same arguments as `set` but it will check the cache first.\nIf a value is already cached, it will return it directly (respond as fast as possible).\nIf the value is marked as stale (generated `n` seconds ago with `n \u003e freshFor`), it will replace the value \nin the cache. When multiple `getOrElse` calls concurrently encounter the same stale value, it will only replace \nthe value once. This is done on a per-instance level, so if you create many cache instances reading and writing \nthe same keys, you are asking for trouble. If you don't, the worst case is every process in your system fetching \nthe value at once. 
In most cases that is still a smaller number than the number of concurrent requests.\n\n**Examples:**\n```js\n// with a value\nconst res = await cache.getOrElse('foo', 'bar');\n\n// with a function returning a value\nconst res = await cache.getOrElse('foo', () =\u003e { return 'bar' });\n\n// with a function returning a promise\nconst res = await cache.getOrElse('foo', () =\u003e { return Promise.resolve('bar') });\n\n// with a promise function\nconst res = await cache.getOrElse('foo', async () =\u003e { return 'bar' });\n```\n\n### Cache.set(key, value, opts) -\u003e Promise\\\u003cvoid\\\u003e\n\nCache store operation. `key` has to be a string; for possible `opts` see `Cache.setDefaults`.\nThe value can be any of the following:\n\na) Anything that can be converted to JSON\u003cbr\u003e\nb) A Promise of (a)\u003cbr\u003e\nc) A function returning (a) or (b)\u003cbr\u003e\n\n**Examples:**\n```js\n// with a value\nawait cache.set('foo', 'bar');\n\n// with a function returning a value\nawait cache.set('foo', () =\u003e { return 'bar' });\n\n// with a function returning a promise\nawait cache.set('foo', () =\u003e { return Promise.resolve('bar') });\n\n// with a promise function\nawait cache.set('foo', async () =\u003e { return 'bar' });\n```\n\n### Cache.flush() -\u003e Promise\\\u003cvoid\\\u003e\n\nFlushes the backend.\n\n**Example:**\n```js\nawait cache.flush();\n```\n\n### Cache.unset(key) -\u003e Promise\\\u003cvoid\\\u003e\n\nCache delete operation.\n`key` has to be a string.\n\n**Example:**\n```js\nawait cache.unset('foo');\n```\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgroupon%2Fnode-cached","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgroupon%2Fnode-cached","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgroupon%2Fnode-cached/lists"}