# GraphQL::FragmentCache

graphql-ruby plugin for caching parts of the response


`GraphQL::FragmentCache` powers up graphql-ruby with the ability to cache response fragments: you can mark any field as cached and it will never be resolved again (at least, while the cache is valid). For instance, the following code caches `title` for each post:

```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: true
end
```

You can support my open-source work here.

## Getting started

Add the gem to your Gemfile (`gem 'graphql-fragment_cache'`) and add the plugin to your schema class:

```ruby
class GraphqSchema < GraphQL::Schema
  use GraphQL::FragmentCache

  query QueryType
end
```

Include `GraphQL::FragmentCache::Object` in your base type class:

```ruby
class BaseType < GraphQL::Schema::Object
  include GraphQL::FragmentCache::Object
end
```

If you're using resolvers, include the module into the base resolver as well:

```ruby
class Resolvers::BaseResolver < GraphQL::Schema::Resolver
  include GraphQL::FragmentCache::ObjectHelpers
end
```

Now you can add the `cache_fragment:` option to your fields to turn caching on:

```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: true
end
```

Alternatively, you can use the `cache_fragment` method inside resolver methods:

```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment { Post.find(id) }
  end
end
```

## Cache key generation

Cache keys consist of the following parts: namespace, implicit key, and explicit key.

### Cache namespace

The namespace is prefixed to every cache key. The default namespace is `graphql`, which is configurable:

```ruby
GraphQL::FragmentCache.namespace = "graphql"
```

### Implicit cache key

The implicit part of a cache key contains information about the schema and the current query. It includes:

  • Hex digest of the schema definition (to make sure the cache is cleared when the schema changes).
  • The current query fingerprint, consisting of a path to the field, arguments information and the selection set.

Let's take a look at an example:

```ruby
query = <<~GQL
  query {
    post(id: 1) {
      id
      title
      cachedAuthor {
        id
        name
      }
    }
  }
GQL

schema_cache_key = GraphqSchema.schema_cache_key
path_cache_key = "post(id:1)/cachedAuthor"
selections_cache_key = "[#{%w[id name].join(".")}]"
query_cache_key = Digest::SHA1.hexdigest("#{path_cache_key}#{selections_cache_key}")

cache_key = "#{schema_cache_key}/#{query_cache_key}/#{object_cache_key}"
```
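The same key construction can be reproduced in plain Ruby without the gem. Below is a sketch: the schema digest input and the `object_cache_key` value are made-up stand-ins, and the configured namespace (`graphql` by default) would additionally be prefixed to the final key.

```ruby
require "digest"

# Stand-in for GraphqSchema.schema_cache_key (a digest of the schema definition)
schema_cache_key = Digest::SHA1.hexdigest("type Post { id: ID! title: String! }")

# Query fingerprint: path to the field plus the selection set
path_cache_key = "post(id:1)/cachedAuthor"
selections_cache_key = "[#{%w[id name].join(".")}]" # => "[id.name]"
query_cache_key = Digest::SHA1.hexdigest("#{path_cache_key}#{selections_cache_key}")

# Stand-in for the object part of the key
object_cache_key = "user/1"

cache_key = "#{schema_cache_key}/#{query_cache_key}/#{object_cache_key}"
```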

You can override `schema_cache_key`, `query_cache_key`, `path_cache_key` or `object_cache_key` by passing parameters to the `cache_fragment` calls:

```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(query_cache_key: "post(#{id})") { Post.find(id) }
  end
end
```

Overriding `path_cache_key` might be helpful when you resolve the same object nested in multiple places (e.g., `Post` and `Comment` both have an `author`), but want to make sure the cache is invalidated when the selection set is different.

The same works for the option form:

```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: {query_cache_key: "post_title"}
end
```

Overriding `object_cache_key` is helpful when the cached value differs from the one used as a key, e.g., a database query that is pre-processed before caching:

```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    query = Post.where("updated_at < ?", Time.now - 1.day)
    cache_fragment(object_cache_key: query.cache_key) { query.some_process }
  end
end
```

### Query arguments processing

You can influence the way GraphQL arguments are included in the cache key.

A use case might be a `:renew_cache` parameter that can be used to force a cache rewrite, but should not be included in the cache key itself. Use `cache_key: {exclude_arguments: […]}` to specify a list of arguments to be excluded from the implicit cache key:

```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
    argument :renew_cache, Boolean, required: false
  end

  def post(id:, renew_cache: false)
    if renew_cache
      context.scoped_set!(:renew_cache, true)
    end
    cache_fragment(cache_key: {exclude_arguments: [:renew_cache]}) { Post.find(id) }
  end
end
```

Likewise, you can use `cache_key: {include_arguments: […]}` to specify an allowlist of arguments to be included in the cache key. In this case, all arguments for the cache key must be specified, including parent arguments of nested fields.
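Conceptually, this filtering is just a hash transformation over the field's arguments before they are digested. The `arguments_for_cache_key` helper below is hypothetical, not part of the gem's API:

```ruby
# Hypothetical sketch of how exclude_arguments / include_arguments
# narrow the argument hash that feeds the implicit cache key.
def arguments_for_cache_key(arguments, exclude_arguments: nil, include_arguments: nil)
  return arguments.reject { |name, _| exclude_arguments.include?(name) } if exclude_arguments
  return arguments.select { |name, _| include_arguments.include?(name) } if include_arguments

  arguments
end

args = {id: 1, renew_cache: true}
arguments_for_cache_key(args, exclude_arguments: [:renew_cache]) # => {id: 1}
arguments_for_cache_key(args, include_arguments: [:id])          # => {id: 1}
```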

### User-provided cache key (custom key)

In most cases you want your cache key to depend on the resolved object (say, an ActiveRecord model). You can do that by passing an argument to the `#cache_fragment` method, similar to the Rails views' `#cache` method:

```ruby
def post(id:)
  post = Post.find(id)
  cache_fragment(post) { post }
end
```

You can pass arrays as well to build a compound cache key:

```ruby
def post(id:)
  post = Post.find(id)
  cache_fragment([post, current_account]) { post }
end
```

You can omit the block if its return value is the same as the cached object:

```ruby
# the following line
cache_fragment(post)
# is the same as
cache_fragment(post) { post }
```

Using literals: even when the same string is used for all queries, the cache still varies per argument and per selection set (because of the query key):

```ruby
def post(id:)
  cache_fragment("find_post") { Post.find(id) }
end
```

Combining with options:

```ruby
def post(id:)
  cache_fragment("find_post", expires_in: 5.minutes) { Post.find(id) }
end
```

Dynamic cache key:

```ruby
def post(id:)
  last_updated_at = Post.select(:updated_at).find_by(id: id)&.updated_at
  cache_fragment(last_updated_at, expires_in: 5.minutes) { Post.find(id) }
end
```

Note the use of `.select(:updated_at)` when fetching the cache key field, which keeps this verification query as fast and light as possible.

You can also add `touch: true` to the corresponding `belongs_to` association (e.g., the author's `belongs_to :post`), so that the post is invalidated whenever the author is updated.

When using the `cache_fragment:` option, the only way to use the resolved value as a cache key is to set:

```ruby
field :post, PostType, null: true, cache_fragment: {cache_key: :object} do
  argument :id, ID, required: true
end

# this is equal to
def post(id:)
  cache_fragment(Post.find(id))
end
```

Also, you can pass `:value` to the `cache_key:` argument to use the returned value to build a key:

```ruby
field :post, PostType, null: true, cache_fragment: {cache_key: :value} do
  argument :id, ID, required: true
end

# this is equal to
def post(id:)
  post = Post.find(id)
  cache_fragment(post) { post }
end
```

If you need more control, you can set `cache_key:` to any custom code:

```ruby
field :posts,
  Types::Objects::PostType.connection_type,
  cache_fragment: {cache_key: -> { object.posts.maximum(:created_at) }}
```

The cache key part for the passed argument is generated as follows:

  • Use `object_cache_key: "some_cache_key"` if it was passed to `cache_fragment`.
  • Use `#graphql_cache_key` if implemented.
  • Use `#cache_key` (or `#cache_key_with_version` for modern Rails) if implemented.
  • Use `self.to_s` for primitive types (strings, symbols, numbers, booleans).
  • Raise `ArgumentError` if none of the above.
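The steps above can be sketched as a plain-Ruby dispatch. This is a simplified illustration of the rules, not the gem's actual code; the explicit `object_cache_key:` option case is omitted, and `ModelLike` is a made-up stand-in for an ActiveRecord model:

```ruby
# Simplified sketch: derive a cache key part from an argument
# following the rules listed above.
def cache_key_part_for(object)
  return object.graphql_cache_key if object.respond_to?(:graphql_cache_key)
  return object.cache_key_with_version if object.respond_to?(:cache_key_with_version)
  return object.cache_key if object.respond_to?(:cache_key)

  case object
  when String, Symbol, Numeric, TrueClass, FalseClass
    object.to_s
  else
    raise ArgumentError, "Can't build a cache key for #{object.inspect}"
  end
end

# Stand-in for an ActiveRecord-like model
ModelLike = Struct.new(:id) do
  def cache_key_with_version
    "posts/#{id}-20240101000000"
  end
end

cache_key_part_for(ModelLike.new(1)) # => "posts/1-20240101000000"
cache_key_part_for(:find_post)       # => "find_post"
```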

### Context cache key

By default, we do not take the context into account when calculating cache keys. That's because caching is more efficient when it's context-free.

However, if you want some fields to be cached per context, you can do that either by passing context objects directly to the `#cache_fragment` method (see above) or by adding a `context_key` option to `cache_fragment:`.

For instance, imagine a query that fetches the current user's social profiles:

```graphql
query {
  socialProfiles {
    provider
    id
  }
}
```

You can cache the result using the context (`context[:user]`) as a cache key:

```ruby
class QueryType < BaseObject
  field :social_profiles, [SocialProfileType], null: false, cache_fragment: {context_key: :user}

  def social_profiles
    context[:user].social_profiles
  end
end
```

This is equal to using `#cache_fragment` the following way:

```ruby
class QueryType < BaseObject
  field :social_profiles, [SocialProfileType], null: false

  def social_profiles
    cache_fragment(context[:user]) { context[:user].social_profiles }
  end
end
```

## Conditional caching

Use the `if:` (or `unless:`) option:

```ruby
def post(id:)
  cache_fragment(if: current_user.nil?) { Post.find(id) }
end

# or

field :post, PostType, cache_fragment: {if: -> { current_user.nil? }} do
  argument :id, ID, required: true
end

# or

field :post, PostType, cache_fragment: {if: :current_user?} do
  argument :id, ID, required: true
end
```

## Default options

You can configure default options that will be passed to all `cache_fragment` calls and `cache_fragment:` configurations. For example:

```ruby
GraphQL::FragmentCache.configure do |config|
  config.default_options = {
    expires_in: 1.hour, # Expire cache keys after 1 hour
    schema_cache_key: nil # Do not clear the cache on each schema change
  }
end
```

## Renewing the cache

You can force the cache to renew during query execution by adding `renew_cache: true` to the query context:

```ruby
MyAppSchema.execute("query { posts { title } }", context: {renew_cache: true})
```

This will treat any cached value as missing even if it's present, and will store freshly computed values in the cache. This can be useful for cache warmers.
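The renewal semantics boil down to a read-through fetch that ignores existing entries when the flag is set. Here is a hypothetical sketch over a plain Hash, not the gem's implementation:

```ruby
# Hypothetical sketch: treat the cached value as missing when the
# context carries renew_cache: true, and overwrite it with a fresh one.
def fetch_fragment(store, key, context)
  unless context[:renew_cache]
    cached = store[key]
    return cached unless cached.nil?
  end

  fresh = yield
  store[key] = fresh
  fresh
end

store = {"graphql/posts" => "stale"}
fetch_fragment(store, "graphql/posts", {}) { "fresh" }                  # => "stale" (cache hit)
fetch_fragment(store, "graphql/posts", {renew_cache: true}) { "fresh" } # => "fresh" (recomputed and stored)
```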

## Cache storage and options

It's up to you to decide which caching engine to use; all you need to do is configure the cache store:

```ruby
GraphQL::FragmentCache.configure do |config|
  config.cache_store = MyCacheStore.new
end
```

Or, in Rails:

```ruby
# config/application.rb (or config/environments/<environment>.rb)
Rails.application.configure do
  # arguments and options are the same as for `config.cache_store`
  config.graphql_fragment_cache.store = :redis_cache_store
end
```

⚠️ The cache store must implement `#read(key)`, `#exist?(key)`, and either `#write_multi(hash, **options)` or `#write(key, value, **options)`.

The gem provides only an in-memory store out of the box (`GraphQL::FragmentCache::MemoryStore`), which is used by default.
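For illustration, a minimal store satisfying that contract could be backed by a plain Hash. This is a sketch: `HashCacheStore` is a made-up name, and it accepts but ignores options such as `expires_in`:

```ruby
# Minimal cache store satisfying the documented contract:
# #read(key), #exist?(key), #write(key, value, **options), #write_multi(hash, **options).
class HashCacheStore
  def initialize
    @data = {}
  end

  def read(key)
    @data[key]
  end

  def exist?(key)
    @data.key?(key)
  end

  def write(key, value, **options)
    # options (e.g., expires_in) are ignored in this sketch
    @data[key] = value
  end

  def write_multi(hash, **options)
    hash.each { |key, value| write(key, value, **options) }
  end
end

store = HashCacheStore.new
store.write("graphql/post/1", "cached title")
store.read("graphql/post/1") # => "cached title"
```

It would then be plugged in via `config.cache_store = HashCacheStore.new` in the configuration block shown above.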

You can pass store-specific options to `#cache_fragment` or `cache_fragment:`. For example, to set expiration (assuming the store's `#write` method supports the `expires_in` option):

```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: {expires_in: 5.minutes}
end

class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(expires_in: 5.minutes) { Post.find(id) }
  end
end
```

## Dataloader

If you are using Dataloader, you will need to let the gem know using `dataloader: true`:

```ruby
class PostType < BaseObject
  field :author, User, null: false

  def author
    cache_fragment(dataloader: true) do
      dataloader.with(AuthorDataloaderSource).load(object.id)
    end
  end
end

# or

class PostType < BaseObject
  field :author, User, null: false, cache_fragment: {dataloader: true}

  def author
    dataloader.with(AuthorDataloaderSource).load(object.id)
  end
end
```

The problem is that I didn't find a way to detect that Dataloader (and, therefore, Fiber) is used, so the block is forced to resolve, which causes an N+1 inside the Dataloader source class.

## How to use `#cache_fragment` in extensions (and other places where context is not available)

If you want to call `#cache_fragment` from places other than fields or resolvers, you'll need to pass `context` explicitly and turn on `raw_value` support. For instance, let's take a look at this extension:

```ruby
class Types::QueryType < Types::BaseObject
  class CurrentMomentExtension < GraphQL::Schema::FieldExtension
    # turning on cache_fragment support
    include GraphQL::FragmentCache::ObjectHelpers

    def resolve(object:, arguments:, context:)
      # context is passed explicitly
      cache_fragment(context: context) do
        result = yield(object, arguments)
        "#{result} (at #{Time.now})"
      end
    end
  end

  field :event, String, null: false, extensions: [CurrentMomentExtension]

  def event
    "something happened"
  end
end
```

With this approach you can use `#cache_fragment` in any place where you have access to the `context`. When the context is not available, the error `cannot find context, please pass it explicitly` will be thrown.

## In-memory fragments

If you have a fragment that is accessed multiple times (e.g., a list of items that belong to the same owner, and the owner is cached), you can avoid multiple cache reads by using the `:keep_in_context` option:

```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(keep_in_context: true, expires_in: 5.minutes) { Post.find(id) }
  end
end
```

This can reduce the number of cache calls but increases memory usage, because the value returned from the cache is kept in the GraphQL context until the query is fully resolved.
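The trade-off can be illustrated with a counting store and a per-request memo. This is a hypothetical sketch of the idea, not the gem's internals; `CountingStore`, `read_fragment`, and the `:loaded_fragments` context key are all made up:

```ruby
# Hypothetical sketch: with keep_in_context, the value read from the
# cache store is memoized in the query context, so repeated fragments
# trigger only one store read per request.
class CountingStore
  attr_reader :reads

  def initialize(data)
    @data = data
    @reads = 0
  end

  def read(key)
    @reads += 1
    @data[key]
  end
end

def read_fragment(store, context, key, keep_in_context: false)
  return store.read(key) unless keep_in_context

  memo = (context[:loaded_fragments] ||= {})
  memo.fetch(key) { memo[key] = store.read(key) }
end

store = CountingStore.new("owner/1" => "cached owner")
context = {}
3.times { read_fragment(store, context, "owner/1", keep_in_context: true) }
store.reads # => 1
```

The memo lives in the context, which is exactly why memory usage grows: the value stays referenced until the query finishes.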

## Execution errors and caching

Sometimes errors happen during query resolution, and it might make sense to skip caching for such queries (for instance, imagine a situation where the client has no access to the requested field and the backend returns `{ data: {}, errors: ["you need a permission to fetch orders"] }`). This is how this behavior can be turned on (it's off by default!):

```ruby
GraphQL::FragmentCache.skip_cache_when_query_has_errors = true
```

As a result, caching will be skipped when the `errors` array is not empty.
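In other words, the write decision reduces to a single predicate. The following is a plain-Ruby sketch of the behavior described above, not the gem's code:

```ruby
# Sketch: should the resolved fragment be written to the cache?
def write_fragment_to_cache?(errors, skip_cache_when_query_has_errors:)
  !(skip_cache_when_query_has_errors && errors.any?)
end

write_fragment_to_cache?([], skip_cache_when_query_has_errors: true)       # => true
write_fragment_to_cache?(["denied"], skip_cache_when_query_has_errors: true)  # => false
write_fragment_to_cache?(["denied"], skip_cache_when_query_has_errors: false) # => true (the default)
```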

## Disabling the cache

Cache processing can be disabled if needed. For example:

```ruby
GraphQL::FragmentCache.enabled = false if Rails.env.test?
```

## Cache lookup monitoring

It may be useful to capture cache lookup events. When monitoring is enabled, the `cache_key`, `operation_name`, `path`, and a boolean indicating a cache hit or miss will be sent to a `cache_lookup_event` method. This method can be implemented in your application to handle the event.

Example handler defined in a Rails initializer:

```ruby
module GraphQL
  module FragmentCache
    class Fragment
      def self.cache_lookup_event(**args)
        # Monitoring, such as incrementing a cache hit counter metric
      end
    end
  end
end
```

Like managing caching itself, monitoring can be enabled if needed. It is disabled by default. For example:

```ruby
GraphQL::FragmentCache.monitoring_enabled = true
```
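A handler could, for example, keep simple hit/miss counters. The sketch below is standalone for illustration: the keyword names (`cache_key`, `operation_name`, `path`, `cache_hit`) mirror the event data described above, but the exact signature passed by the gem is an assumption:

```ruby
# Sketch of a cache_lookup_event-style handler tallying hits and misses.
module CacheMetrics
  COUNTS = Hash.new(0)

  def self.cache_lookup_event(cache_key:, operation_name:, path:, cache_hit:)
    COUNTS[cache_hit ? :hit : :miss] += 1
  end
end

CacheMetrics.cache_lookup_event(cache_key: "graphql/abc", operation_name: "GetPost", path: ["post"], cache_hit: true)
CacheMetrics.cache_lookup_event(cache_key: "graphql/def", operation_name: "GetPost", path: ["post"], cache_hit: false)
CacheMetrics::COUNTS # => {hit: 1, miss: 1}
```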

## Limitations

  1. `Schema#execute`, graphql-batch and graphql-ruby-fragment_cache do not play well together. The problem appears when `cache_fragment` is inside the `.then` block:

```ruby
def cached_author_inside_batch
  AuthorLoader.load(object).then do |author|
    cache_fragment(author, context: context)
  end
end
```

The problem is that the context is not properly populated inside the block (the gem uses `:current_path` to build the cache key). There are two possible workarounds: use dataloaders or manage `:current_path` manually:

```ruby
def cached_author_inside_batch
  outer_path = context.namespace(:interpreter)[:current_path]

  AuthorLoader.load(object).then do |author|
    context.namespace(:interpreter)[:current_path] = outer_path
    cache_fragment(author, context: context)
  end
end
```

  2. Caching does not work for union types because of the Lookahead implementation: it requires the exact type to be passed to the `selection` method (you can find the discussion here). This method is used for cache key building, and I haven't found a workaround yet (PR in progress). If you get the `Failed to look ahead the field` error — please pass `path_cache_key` explicitly:

```ruby
field :cached_avatar_url, String, null: false

def cached_avatar_url
  cache_fragment(path_cache_key: "post_avatar_url(#{object.id})") { object.avatar_url }
end
```

## Credits

Based on the original gist by @palkan and @ssnickolay.

Initially sponsored by Evil Martians.

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache.

## License

The gem is available as open source under the terms of the MIT License.
