{"id":13879637,"url":"https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache","last_synced_at":"2025-07-16T15:32:39.704Z","repository":{"id":43585952,"uuid":"247500320","full_name":"DmitryTsepelev/graphql-ruby-fragment_cache","owner":"DmitryTsepelev","description":"graphql-ruby plugin for caching parts of the response","archived":false,"fork":false,"pushed_at":"2024-11-09T14:50:36.000Z","size":269,"stargazers_count":204,"open_issues_count":3,"forks_count":34,"subscribers_count":4,"default_branch":"master","last_synced_at":"2024-11-21T05:07:10.060Z","etag":null,"topics":["cache","graphql","graphql-ruby","ruby"],"latest_commit_sha":null,"homepage":"","language":"Ruby","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/DmitryTsepelev.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-03-15T16:01:37.000Z","updated_at":"2024-11-09T14:50:40.000Z","dependencies_parsed_at":"2024-01-13T20:57:36.991Z","dependency_job_id":"a4ca19e6-3339-47e7-babc-c2886420549b","html_url":"https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache","commit_stats":{"total_commits":154,"total_committers":10,"mean_commits":15.4,"dds":0.2987012987012987,"last_synced_commit":"4d15a0d40d024094baa985e508abaacffec405e3"},"previous_names":[],"tags_count":43,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DmitryTsepelev%2Fgraphql-ruby-fragment_cache","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DmitryTsepelev%2Fgraphql-ruby-fragment_cache/tags","relea
ses_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DmitryTsepelev%2Fgraphql-ruby-fragment_cache/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DmitryTsepelev%2Fgraphql-ruby-fragment_cache/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/DmitryTsepelev","download_url":"https://codeload.github.com/DmitryTsepelev/graphql-ruby-fragment_cache/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":225680535,"owners_count":17507152,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cache","graphql","graphql-ruby","ruby"],"created_at":"2024-08-06T08:02:27.424Z","updated_at":"2025-07-16T15:32:39.691Z","avatar_url":"https://github.com/DmitryTsepelev.png","language":"Ruby","readme":"# GraphQL::FragmentCache ![CI](https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache/actions/workflows/rspec.yml/badge.svg?branch=master) ![](https://ruby-gem-downloads-badge.herokuapp.com/graphql-fragment_cache?type=total)\n\n`GraphQL::FragmentCache` powers up [graphql-ruby](https://graphql-ruby.org) with the ability to cache response _fragments_: you can mark any field as cached and it will never be resolved again (at least, while cache is valid). 
For instance, the following code caches `title` for each post:\n\n```ruby\nclass PostType \u003c BaseObject\n  field :id, ID, null: false\n  field :title, String, null: false, cache_fragment: true\nend\n```\n\nYou can support my open-source work [here](https://boosty.to/dmitry_tsepelev).\n\n## Getting started\n\nAdd the gem to your Gemfile `gem 'graphql-fragment_cache'` and add the plugin to your schema class:\n\n```ruby\nclass GraphqSchema \u003c GraphQL::Schema\n  use GraphQL::FragmentCache\n\n  query QueryType\nend\n```\n\nInclude `GraphQL::FragmentCache::Object` in your base type class:\n\n```ruby\nclass BaseType \u003c GraphQL::Schema::Object\n  include GraphQL::FragmentCache::Object\nend\n```\n\nIf you're using [resolvers](https://graphql-ruby.org/fields/resolvers.html), include the module in the base resolver as well:\n\n```ruby\nclass Resolvers::BaseResolver \u003c GraphQL::Schema::Resolver\n  include GraphQL::FragmentCache::ObjectHelpers\nend\n```\n\nNow you can add the `cache_fragment:` option to your fields to turn caching on:\n\n```ruby\nclass PostType \u003c BaseObject\n  field :id, ID, null: false\n  field :title, String, null: false, cache_fragment: true\nend\n```\n\nAlternatively, you can use the `cache_fragment` method inside resolver methods:\n\n```ruby\nclass QueryType \u003c BaseObject\n  field :post, PostType, null: true do\n    argument :id, ID, required: true\n  end\n\n  def post(id:)\n    cache_fragment { Post.find(id) }\n  end\nend\n```\n\n## Cache key generation\n\nCache keys consist of the following parts: namespace, implicit key, and explicit key.\n\n### Cache namespace\n\nThe namespace is prefixed to every cached key. The default namespace is `graphql`, which is configurable:\n\n```ruby\nGraphQL::FragmentCache.namespace = \"graphql\"\n```\n\n### Implicit cache key\n\nThe implicit part of a cache key contains information about the schema and the current query. 
It includes:\n\n- Hex digest of the schema definition (to make sure the cache is cleared when the schema changes).\n- The current query fingerprint, consisting of a _path_ to the field, argument information, and the selection set.\n\nLet's take a look at an example:\n\n```ruby\nquery = \u003c\u003c~GQL\n  query {\n    post(id: 1) {\n      id\n      title\n      cachedAuthor {\n        id\n        name\n      }\n    }\n  }\nGQL\n\nschema_cache_key = GraphqSchema.schema_cache_key\n\npath_cache_key = \"post(id:1)/cachedAuthor\"\nselections_cache_key = \"[#{%w[id name].join(\".\")}]\"\n\nquery_cache_key = Digest::SHA1.hexdigest(\"#{path_cache_key}#{selections_cache_key}\")\n\ncache_key = \"#{schema_cache_key}/#{query_cache_key}/#{object_cache_key}\"\n```\n\nYou can override `schema_cache_key`, `query_cache_key`, `path_cache_key` or `object_cache_key` by passing parameters to the `cache_fragment` calls:\n\n```ruby\nclass QueryType \u003c BaseObject\n  field :post, PostType, null: true do\n    argument :id, ID, required: true\n  end\n\n  def post(id:)\n    cache_fragment(query_cache_key: \"post(#{id})\") { Post.find(id) }\n  end\nend\n```\n\nOverriding `path_cache_key` might be helpful when you resolve the same object nested in multiple places (e.g., `Post` and `Comment` both have `author`) but want to make sure the cache is invalidated when the selection set is different.\n\nThe same can be done via the `cache_fragment:` option:\n\n```ruby\nclass PostType \u003c BaseObject\n  field :id, ID, null: false\n  field :title, String, null: false, cache_fragment: {query_cache_key: \"post_title\"}\nend\n```\n\nOverriding `object_cache_key` is helpful when the cached value is different from the one used as a key, such as a database query that is processed before caching.\n\n```ruby\nclass QueryType \u003c BaseObject\n  field :post, PostType, null: true do\n    argument :id, ID, required: true\n  end\n\n  def post(id:)\n    query = Post.where(\"updated_at \u003c ?\", Time.now - 1.day)\n    cache_fragment(object_cache_key: query.cache_key) { query.some_process }\n  end\nend\n```\n\n### Query arguments processing\n\nYou can influence the way GraphQL arguments are included in the cache key.\n\nA use case might be a `:renew_cache` parameter that can be used to force a cache rewrite,\nbut should not be included in the cache key itself. Use `cache_key: { exclude_arguments: […]}`\nto specify a list of arguments to be excluded from the implicit cache key.\n\n```ruby\nclass QueryType \u003c BaseObject\n  field :post, PostType, null: true do\n    argument :id, ID, required: true\n    argument :renew_cache, Boolean, required: false\n  end\n\n  def post(id:, renew_cache: false)\n    if renew_cache\n      context.scoped_set!(:renew_cache, true)\n    end\n    cache_fragment(cache_key: {exclude_arguments: [:renew_cache]}) { Post.find(id) }\n  end\nend\n```\n\nLikewise, you can use `cache_key: { include_arguments: […] }` to specify an allowlist of arguments\nto be included in the cache key. In this case all arguments for the cache key must be specified, including\nparent arguments of nested fields.\n\n### User-provided cache key (custom key)\n\nIn most cases you want your cache key to depend on the resolved object (say, an `ActiveRecord` model). 
You can do that by passing an argument to the `#cache_fragment` method, similar to the Rails views' [`#cache` method](https://guides.rubyonrails.org/caching_with_rails.html#fragment-caching):\n\n```ruby\ndef post(id:)\n  post = Post.find(id)\n  cache_fragment(post) { post }\nend\n```\n\nYou can pass arrays as well to build a compound cache key:\n\n```ruby\ndef post(id:)\n  post = Post.find(id)\n  cache_fragment([post, current_account]) { post }\nend\n```\n\nYou can omit the block if its return value is the same as the cached object:\n\n```ruby\n# the following line\ncache_fragment(post)\n# is the same as\ncache_fragment(post) { post }\n```\n\nUsing literals: even when the same string is used for all queries, the cache still varies per argument and per selection set (because of the query cache key).\n\n```ruby\ndef post(id:)\n  cache_fragment(\"find_post\") { Post.find(id) }\nend\n```\n\nCombining with options:\n\n```ruby\ndef post(id:)\n  cache_fragment(\"find_post\", expires_in: 5.minutes) { Post.find(id) }\nend\n```\n\nDynamic cache key:\n\n```ruby\ndef post(id:)\n  last_updated_at = Post.select(:updated_at).find_by(id: id)\u0026.updated_at\n  cache_fragment(last_updated_at, expires_in: 5.minutes) { Post.find(id) }\nend\n```\n\nNote the use of `.select(:updated_at)` when fetching the cache key field, which keeps this verification query as fast and light as possible.\n\nYou can also set `touch: true` on the corresponding `belongs_to` association (e.g., the author's `belongs_to :post, touch: true`) so that the post is invalidated when its author is updated.\n\nWhen using the `cache_fragment:` option, it's only possible to use the resolved value as a cache key by setting:\n\n```ruby\nfield :post, PostType, null: true, cache_fragment: {cache_key: :object} do\n  argument :id, ID, required: true\nend\n\n# this is equal to\ndef post(id:)\n  cache_fragment(Post.find(id))\nend\n```\n\nAlso, you can pass `:value` to the `cache_key:` argument to use the returned value to build a key:\n\n```ruby\nfield :post, 
PostType, null: true, cache_fragment: {cache_key: :value} do\n  argument :id, ID, required: true\nend\n\n# this is equal to\ndef post(id:)\n  post = Post.find(id)\n  cache_fragment(post) { post }\nend\n```\n\nIf you need more control, you can set `cache_key:` to any custom code:\n\n```ruby\nfield :posts,\n  Types::Objects::PostType.connection_type,\n  cache_fragment: {cache_key: -\u003e { object.posts.maximum(:created_at) }}\n```\n\nThe cache key part for the passed argument is generated as follows:\n\n- Use `object_cache_key: \"some_cache_key\"` if passed to `cache_fragment`.\n- Use `#graphql_cache_key` if implemented.\n- Use `#cache_key` (or `#cache_key_with_version` for modern Rails) if implemented.\n- Use `self.to_s` for _primitive_ types (strings, symbols, numbers, booleans).\n- Raise `ArgumentError` if none of the above.\n\n### Context cache key\n\nBy default, we do not take context into account when calculating cache keys. That's because caching is more efficient when it's _context-free_.\n\nHowever, if you want some fields to be cached per context, you can do that either by passing context objects directly to the `#cache_fragment` method (see above) or by adding a `context_key` option to `cache_fragment:`.\n\nFor instance, imagine a query that returns the current user's social profiles:\n\n```gql\nquery {\n  socialProfiles {\n    provider\n    id\n  }\n}\n```\n\nYou can cache the result using the context (`context[:user]`) as a cache key:\n\n```ruby\nclass QueryType \u003c BaseObject\n  field :social_profiles, [SocialProfileType], null: false, cache_fragment: {context_key: :user}\n\n  def social_profiles\n    context[:user].social_profiles\n  end\nend\n```\n\nThis is equal to using `#cache_fragment` in the following way:\n\n```ruby\nclass QueryType \u003c BaseObject\n  field :social_profiles, [SocialProfileType], null: false\n\n  def social_profiles\n    cache_fragment(context[:user]) { context[:user].social_profiles }\n  end\nend\n```\n\n## 
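Object cache key resolution example

The fallback order described in the list above can be sketched in plain Ruby. This is a hypothetical illustration, not the gem's actual implementation; `object_cache_key_for` and `Account` are made-up names:

```ruby
# Illustrative sketch of the fallback order used to build the cache key part
# for an object passed to cache_fragment (hypothetical and simplified).
def object_cache_key_for(object)
  return object.graphql_cache_key if object.respond_to?(:graphql_cache_key)
  return object.cache_key_with_version if object.respond_to?(:cache_key_with_version)
  return object.cache_key if object.respond_to?(:cache_key)

  case object
  when String, Symbol, Numeric, true, false
    object.to_s
  else
    raise ArgumentError, 'unable to build a cache key part for the object'
  end
end

# A plain object can opt in by defining #graphql_cache_key:
Account = Struct.new(:id, :updated_at) do
  def graphql_cache_key
    ['account', id, updated_at.to_i].join('/')
  end
end

object_cache_key_for(Account.new(1, Time.at(0))) # => 'account/1/0'
object_cache_key_for(:find_post)                 # => 'find_post'
```

Primitive values fall through to `to_s`, which is why literals like `'find_post'` work as keys.

## 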
Conditional caching\n\nUse the `if:` (or `unless:`) option:\n\n```ruby\ndef post(id:)\n  cache_fragment(if: current_user.nil?) { Post.find(id) }\nend\n\n# or\n\nfield :post, PostType, cache_fragment: {if: -\u003e { current_user.nil? }} do\n  argument :id, ID, required: true\nend\n\n# or\n\nfield :post, PostType, cache_fragment: {if: :current_user?} do\n  argument :id, ID, required: true\nend\n```\n\n## Default options\n\nYou can configure default options that will be passed to all `cache_fragment`\ncalls and `cache_fragment:` configurations. For example:\n\n```ruby\nGraphQL::FragmentCache.configure do |config|\n  config.default_options = {\n    expires_in: 1.hour, # Expire cache keys after 1 hour\n    schema_cache_key: nil # Do not clear the cache on each schema change\n  }\nend\n```\n\n## Renewing the cache\n\nYou can force the cache to renew during query execution by adding\n`renew_cache: true` to the query context:\n\n```ruby\nMyAppSchema.execute(\"query { posts { title } }\", context: {renew_cache: true})\n```\n\nThis will treat any cached value as missing even if it's present, and store\nfreshly computed values in the cache. This can be useful for cache warmers.\n\n## Cache storage and options\n\nIt's up to you to decide which caching engine to use; all you need to do is configure the cache store:\n\n```ruby\nGraphQL::FragmentCache.configure do |config|\n  config.cache_store = MyCacheStore.new\nend\n```\n\nOr, in Rails:\n\n```ruby\n# config/application.rb (or config/environments/\u003cenvironment\u003e.rb)\nRails.application.configure do |config|\n  # arguments and options are the same as for `config.cache_store`\n  config.graphql_fragment_cache.store = :redis_cache_store\nend\n```\n\n⚠️ The cache store must implement `#read(key)`, `#exist?(key)`, and either `#write_multi(hash, **options)` or `#write(key, value, **options)`.\n\nThe gem provides only an in-memory store out of the box (`GraphQL::FragmentCache::MemoryStore`). 
It's used by default.\n\nYou can pass store-specific options to `#cache_fragment` or `cache_fragment:`. For example, to set expiration (assuming the store's `#write` method supports the `expires_in` option):\n\n```ruby\nclass PostType \u003c BaseObject\n  field :id, ID, null: false\n  field :title, String, null: false, cache_fragment: {expires_in: 5.minutes}\nend\n\nclass QueryType \u003c BaseObject\n  field :post, PostType, null: true do\n    argument :id, ID, required: true\n  end\n\n  def post(id:)\n    cache_fragment(expires_in: 5.minutes) { Post.find(id) }\n  end\nend\n```\n\n## Dataloader\n\nIf you are using [Dataloader](https://graphql-ruby.org/dataloader/overview.html), you will need to let the gem know using `dataloader: true`:\n\n```ruby\nclass PostType \u003c BaseObject\n  field :author, User, null: false\n\n  def author\n    cache_fragment(dataloader: true) do\n      dataloader.with(AuthorDataloaderSource).load(object.id)\n    end\n  end\nend\n\n# or\n\nclass PostType \u003c BaseObject\n  field :author, User, null: false, cache_fragment: {dataloader: true}\n\n  def author\n    dataloader.with(AuthorDataloaderSource).load(object.id)\n  end\nend\n```\n\nThe reason this option is needed is that I didn't find a way to detect that a dataloader (and, therefore, a Fiber) is used, so the block is forced to resolve, causing the N+1 queries inside the Dataloader source class.\n\n## How to use `#cache_fragment` in extensions (and other places where context is not available)\n\nIf you want to call `#cache_fragment` from places other than fields or resolvers, you'll need to pass `context` explicitly and turn on `raw_value` support. 
For instance, let's take a look at this extension:\n\n```ruby\nclass Types::QueryType \u003c Types::BaseObject\n  class CurrentMomentExtension \u003c GraphQL::Schema::FieldExtension\n    # turning on cache_fragment support\n    include GraphQL::FragmentCache::ObjectHelpers\n\n    def resolve(object:, arguments:, context:)\n      # context is passed explicitly\n      cache_fragment(context: context) do\n        result = yield(object, arguments)\n        \"#{result} (at #{Time.now})\"\n      end\n    end\n  end\n\n  field :event, String, null: false, extensions: [CurrentMomentExtension]\n\n  def event\n    \"something happened\"\n  end\nend\n```\n\nWith this approach you can use `#cache_fragment` in any place where you have access to the `context`. When the context is not available, the error `cannot find context, please pass it explicitly` will be raised.\n\n## In-memory fragments\n\nIf you have a fragment that is accessed multiple times (e.g., a list of items that belong to the same owner, and the owner is cached), you can avoid multiple cache reads by using the `:keep_in_context` option:\n\n```ruby\nclass QueryType \u003c BaseObject\n  field :post, PostType, null: true do\n    argument :id, ID, required: true\n  end\n\n  def post(id:)\n    cache_fragment(keep_in_context: true, expires_in: 5.minutes) { Post.find(id) }\n  end\nend\n```\n\nThis can reduce the number of cache calls but _increase_ memory usage, because the value returned from the cache will be kept in the GraphQL context until the query is fully resolved.\n\n## Execution errors and caching\n\nSometimes errors happen during query resolution, and it might make sense to skip caching for such queries (for instance, imagine a situation when the client has no access to the requested field and the backend returns `{ data: {}, errors: [\"you need a permission to fetch orders\"] }`). 
This is how this behavior can be turned on (_it's off by default!_):\n\n```ruby\nGraphQL::FragmentCache.skip_cache_when_query_has_errors = true\n```\n\nAs a result, caching will be skipped when the `errors` array is not empty.\n\n## Disabling the cache\n\nCache processing can be disabled if needed. For example:\n\n```ruby\nGraphQL::FragmentCache.enabled = false if Rails.env.test?\n```\n\n## Cache lookup monitoring\n\nIt may be useful to capture cache lookup events. When monitoring is enabled, the `cache_key`, `operation_name`, `path`, and a boolean indicating a cache hit or miss will be sent to a `cache_lookup_event` method. This method can be implemented in your application to handle the event.\n\nExample handler defined in a Rails initializer:\n\n```ruby\nmodule GraphQL\n  module FragmentCache\n    class Fragment\n      def self.cache_lookup_event(**args)\n        # Monitoring such as incrementing a cache hit counter metric\n      end\n    end\n  end\nend\n```\n\nLike caching itself, monitoring can be enabled when needed; it is disabled by default. For example:\n\n```ruby\nGraphQL::FragmentCache.monitoring_enabled = true\n```\n\n## Limitations\n\n1. `Schema#execute`, [graphql-batch](https://github.com/Shopify/graphql-batch) and _graphql-ruby-fragment_cache_ do not [play well](https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache/issues/45) together. The problem appears when `cache_fragment` is _inside_ the `.then` block:\n\n```ruby\ndef cached_author_inside_batch\n  AuthorLoader.load(object).then do |author|\n    cache_fragment(author, context: context)\n  end\nend\n```\n\nThe problem is that context is not [properly populated](https://github.com/rmosolgo/graphql-ruby/issues/3397) inside the block (the gem uses `:current_path` to build the cache key). 
There are two possible workarounds: use [dataloaders](https://graphql-ruby.org/dataloader/overview.html) or manage `:current_path` manually:\n\n```ruby\ndef cached_author_inside_batch\n  outer_path = context.namespace(:interpreter)[:current_path]\n\n  AuthorLoader.load(object).then do |author|\n    context.namespace(:interpreter)[:current_path] = outer_path\n    cache_fragment(author, context: context)\n  end\nend\n```\n\n2. Caching does not work for Union types, because of the `Lookahead` implementation: it requires the exact type to be passed to the `selection` method (you can find the [discussion](https://github.com/rmosolgo/graphql-ruby/pull/3007) here). This method is used for cache key building, and I haven't found a workaround yet ([PR in progress](https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache/pull/30)). If you get `Failed to look ahead the field` error — please pass `path_cache_key` explicitly:\n\n```ruby\nfield :cached_avatar_url, String, null: false\n\ndef cached_avatar_url\n  cache_fragment(path_cache_key: \"post_avatar_url(#{object.id})\") { object.avatar_url }\nend\n```\n\n## Credits\n\nBased on the original [gist](https://gist.github.com/palkan/faad9f6ff1db16fcdb1c071ec50e4190) by [@palkan](https://github.com/palkan) and [@ssnickolay](https://github.com/ssnickolay).\n\nInitially sponsored by [Evil Martians](http://evilmartians.com).\n\n## Contributing\n\nBug reports and pull requests are welcome on GitHub at [https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache](https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache).\n\n## License\n\nThe gem is available as open source under the terms of the [MIT 
License](https://opensource.org/licenses/MIT).\n","funding_links":[],"categories":["Ruby"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FDmitryTsepelev%2Fgraphql-ruby-fragment_cache","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FDmitryTsepelev%2Fgraphql-ruby-fragment_cache","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FDmitryTsepelev%2Fgraphql-ruby-fragment_cache/lists"}