Complexity Analyzer

Overview

The Complexity Analyzer in GraphQL.NET is a powerful tool designed to manage the complexity and depth of GraphQL queries. It ensures that queries remain within acceptable bounds to prevent excessive load on the server. This documentation will guide you through the basic and advanced configuration of the complexity analyzer.

Key Features

  • Fields can define the impact for the execution of that field (e.g., how long it will take to execute the ResolveAsync method) separate from a multiplier applied to the child fields (e.g., how many rows it returns for list fields).
  • Default configuration allows for setting three default values:
    • Scalar impact (impact to use for scalar fields)
    • Object impact (impact to use for object fields)
    • Default child multiplier for list fields (multiplier to use for list fields)
  • Per-field configurable behavior to determine field impact and child impact multipliers; by default it uses the scalar impact for scalar fields, object impact for object fields, and the default child multiplier for list fields.
  • Default behavior considers connection semantics, such as first, last, and id arguments, to adjust the child multiplier accordingly.
  • Configuring a schema to ignore complexity for introspection fields is straightforward.
  • Easy to write asynchronous code to implement per-user, per-IP, or throttling limits.

Basic Configuration

Setting Up Complexity Analyzer

services.AddGraphQL(b => b
    .AddSchema<MySchema>()
    .AddComplexityAnalyzer(c =>
    {
        c.MaxDepth = 10;
        c.MaxComplexity = 100;
    }));

Configurable Options

Option | Description | Default Value
MaxDepth | Limits the maximum depth of a query. | null
MaxComplexity | Limits the total complexity of a query. | null
DefaultScalarImpact | Specifies the default complexity impact for scalar fields. | 1
DefaultObjectImpact | Specifies the default complexity impact for object fields. | 1
DefaultListImpactMultiplier | Specifies the average number of items returned by list fields. | 20
ValidateComplexityDelegate | Allows for custom validation and logging based on query complexity and depth. | null
DefaultComplexityImpactDelegate | Provides a default mechanism to calculate field impact and child impact multipliers. | see below
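
A configuration that sets several of the options above might look like the following sketch; the specific values are illustrative only.

services.AddGraphQL(b => b
    .AddSchema<MySchema>()
    .AddComplexityAnalyzer(c =>
    {
        c.MaxDepth = 15;                      // reject queries nested deeper than 15 levels
        c.MaxComplexity = 500;                // reject queries with a total complexity above 500
        c.DefaultScalarImpact = 1;            // cost of each scalar field
        c.DefaultObjectImpact = 2;            // cost of each object field
        c.DefaultListImpactMultiplier = 10;   // assume list fields return 10 items on average
    }));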

Default Complexity Impact Delegate

The DefaultComplexityImpactDelegate is a built-in mechanism in GraphQL.NET that provides a default way to calculate the complexity impact of fields within a query. By default, this delegate assigns a complexity impact based on the type of the field being resolved. Scalar fields are given a default impact defined by DefaultScalarImpact, while object fields are assigned an impact defined by DefaultObjectImpact. For list fields, the delegate multiplies the impact by the DefaultListImpactMultiplier, unless a specific argument like first, last, or id is provided, which then adjusts the multiplier accordingly (set to 1 if the id argument is present). The delegate also considers connection semantics, ensuring that the impact is accurately reflected based on parent and child relationships within the query. This default behavior ensures a logical and consistent calculation of query complexity, making it easier to manage and limit query depth and execution cost.
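
As a rough sketch only (this is not the built-in implementation), the behavior described above for a single list field exposing first and last arguments could be approximated with the per-field delegate overload shown in the Advanced Configurations section below; usersField and the argument handling are illustrative.

usersField.WithComplexityImpact(context =>
{
    double fieldImpact = 1;                       // analogous to DefaultScalarImpact / DefaultObjectImpact
    double childImpactMultiplier =
        context.GetArgument<int?>("first", null)  // prefer the requested page size...
        ?? context.GetArgument<int?>("last", null)
        ?? 20;                                    // ...falling back to DefaultListImpactMultiplier
    return new(fieldImpact, childImpactMultiplier);
});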

Sample GraphQL Request and Computed Complexity

The below sample assumes that the complexity analyzer is configured with the default values.

query {                  #  impact   multiplier   total impact   child multiplier   depth
  users(first: 10) {     #     1          1             1                 10          1
    id                   #     1         10            11                             2
    posts {              #     1         10            21                 20          2
      id                 #     1        200           221                             3
      comments {         #     1        200           421                 20          3
        id               #     1       4000          4421                             4
      }
    }
  }
  products(id: "5") {    #     1          1          4422                  1          1
    id                   #     1          1          4423                             2
    name                 #     1          1          4424                             2
    photos {             #     1          1          4425                 20          2
      id                 #     1         20          4445                             3
      name               #     1         20          4465                             3
    }
    category {           #     1          1          4466                  1          2
      id                 #     1          1          4467                             3
      name               #     1          1          4468                             3
    }
  }
}

The above query will have the following complexity calculation:

  • Maximum Depth: 4 (users -> posts -> comments -> id)
  • Total Complexity: 4468

These values are calculated based on the following facts demonstrated in the above query:

  • The users field requested 10 items, so the child multiplier is set to 10.
  • The posts field is a list field and uses the default child multiplier of 20.
  • The comments field is a list field and uses the default child multiplier of 20.
  • The products field has an id argument, so the child multiplier is set to 1.
  • The photos field is a list field and uses the default child multiplier of 20.
  • The category field is not a list field and so does not use the default child multiplier.
  • Other fields are scalar fields and use the default scalar impact of 1, multiplied by the multiplier calculated for that level of the graph.
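
Note how the multipliers compound down the tree: the comments field's multiplier of 200 is the users child multiplier (10) times the posts child multiplier (20), so each id selected under comments costs 1 × 10 × 20 × 20 = 4,000, which accounts for most of the total of 4,468.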

Example Scenarios

1. Estimating the Total Number of Nodes and Maximum Depth

To configure the complexity analyzer to estimate the total number of nodes returned and/or the maximum depth, you can use the default configuration, or customize the default impact multiplier, or customize the impact multiplier used for specific fields. The default configuration assumes that list fields return an average of 20 items.

Configuring Child Impact Multiplier for Specific Fields

// Code-first
usersField.WithComplexityImpact(fieldImpact: 1, childImpactMultiplier: 100); // Assume the users field returns 100 items on average

// Schema-first / type-first:
[Complexity(fieldImpact: 1, childImpactMultiplier: 100)]
public static IEnumerable<User> Users([FromServices] IUserService userService)
    => userService.GetUsers();

Setting a Global Default Multiplier

complexityConfig.DefaultListImpactMultiplier = 7; // Assume that other list fields return 7 items on average

Sample GraphQL Request and Computed Complexity

query {              #  impact   multiplier   total impact   child multiplier   depth
  users {            #     1          1             1                100          1
    id               #     1        100           101                             2
    posts {          #     1        100           201                  7          2
      id             #     1        700           901                             3
      comments {     #     1        700          1601                  7          3
        id           #     1       4900          6501                             4
      }
    }
  }
}

The above query will have the following complexity calculation:

  • Maximum Depth: 4 (users -> posts -> comments -> id)
  • Total Complexity: 6501

Since the number of rows returned from list fields can vary, it is recommended to use connection fields and to require the first or last argument to allow the complexity analyzer to properly estimate the child multiplier for list fields (or have the default page size set very small). You can also choose to set the scalar and object impact to zero if you prefer to only consider the number of nodes and maximum depth, similar to the GitHub GraphQL API rate limits.
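
For example, a code-first connection-style field can require the first argument so the analyzer always has a page size to use as the child multiplier; UserGraphType and the field name below are illustrative placeholders.

// Require `first`; the default complexity impact delegate uses it as the child multiplier
Field<ListGraphType<UserGraphType>>("users")
    .Argument<NonNullGraphType<IntGraphType>>("first");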

2. Ignoring or Reducing Impact of Introspection Requests

To prevent introspection requests from affecting the complexity calculation, you can configure the introspection fields' impact and child multiplier. An extension method is provided to simplify this configuration as shown below:

// Code-first:
schema.WithIntrospectionComplexityImpact(0);   // Ignore introspection fields
// or
schema.WithIntrospectionComplexityImpact(0.1); // Reduce impact to 10%

// During DI setup:
services.AddGraphQL(b => b
    .ConfigureSchema(schema => schema.WithIntrospectionComplexityImpact(0)));

The above method sets the complexity impact and child multiplier for the three meta-fields to the provided value, effectively ignoring or reducing the impact of introspection requests on the complexity calculation.

Sample GraphQL Request and Computed Complexity

{
  __schema {
    types {
      name
      fields {
        name
      }
    }
  }
}

The above query will have the following complexity calculation:

  • Maximum Depth: 4 (schema -> types -> fields -> name)
  • Total Complexity: 0

Please note that the maximum depth calculation will still include introspection fields.

To exclude introspection fields from the maximum depth calculation, you can write a custom complexity validation delegate that ignores depth limits for introspection requests:

complexityConfig.ValidateComplexityDelegate = async (context) =>
{
    if (IsIntrospectionRequest(context.ValidationContext))
    {
        context.Error = null; // ignore complexity errors
    }

    static bool IsIntrospectionRequest(ValidationContext validationContext)
    {
        return validationContext.Document.Definitions.OfType<GraphQLOperationDefinition>().All(
            op => op.Operation == OperationType.Query && op.SelectionSet.Selections.All(
                node => node is GraphQLField field &&
                    (field.Name.Value == "__schema" || field.Name.Value == "__type")));
    }
};
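
GraphQLOperationDefinition, GraphQLField, and OperationType in the snippet above come from the GraphQLParser.AST namespace (the GraphQL-Parser library used by GraphQL.NET), so add the corresponding using directive when copying this delegate.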

3. Estimating Computing Power (Database Processing Time)

Another use case for the complexity analyzer is to estimate the computing power required to process a query. You can configure the impact for object fields to estimate the database processing time by setting a custom default object impact or configuring the impact for specific fields. The below examples assume that the scalar impact is 1, but you may wish to adjust this to zero if scalar fields do not require consequential processing time.

Configuring Impact for Object Fields

// Set higher impact for field resolvers that require more processing time

// Code-first
usersField.WithComplexityImpact(fieldImpact: 50);

// Schema-first / type-first:
[Complexity(fieldImpact: 50)]
public static IEnumerable<User> Users([FromServices] IUserService userService)
    => userService.GetUsers();

Setting a Custom Default Object Impact

// Set default for object fields (assumed to need to load from a database)
complexityConfig.DefaultObjectImpact = 20;

Sample GraphQL Request and Computed Complexity

query {              #  impact   multiplier   total impact   child multiplier   depth
  users {            #    50          1            50                 20          1
    id               #     1         20            70                             2
    posts {          #    20         20           470                 20          2
      id             #     1        400           870                             3
      comments {     #    20        400          8870                 20          3
        id           #     1       8000         16870                             4
      }
    }
  }
}

The above query will have the following complexity calculation:

  • Maximum Depth: 4 (users -> posts -> comments -> id)
  • Total Complexity: 16870
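
As with the earlier example, the parent multipliers compound: comments is reached under an estimated 20 users × 20 posts = 400 parents, so its object impact of 20 contributes 20 × 400 = 8,000, and each id beneath it contributes 1 × 400 × 20 = 8,000 more.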

4. Logging Complexity Results

In addition to validation, the ValidateComplexityDelegate property allows you to log complexity results for monitoring or analysis.

complexityConfig.ValidateComplexityDelegate = async (context) =>
{
    // RequestServices may be used to access scoped services within the DI container
    var logger = context.ValidationContext.RequestServices!.GetRequiredService<ILogger<MySchema>>();

    if (context.Error != null) // failed complexity limits
        logger.LogWarning($"Query Complexity: {context.TotalComplexity}, Depth: {context.MaxDepth}");
    else
        logger.LogInformation($"Query Complexity: {context.TotalComplexity}, Depth: {context.MaxDepth}");
};

5. Throttling Users Based on Complexity Analysis

To throttle users on a per-user basis similar to GitHub's GraphQL API limits, configure the complexity analyzer with a custom validation delegate. As noted above, MaxComplexity and MaxDepth, if set, are still enforced before this delegate runs.

complexityConfig.ValidateComplexityDelegate = async (context) =>
{
    // Skip throttling if the query has already exceeded complexity limits
    if (context.Error != null)
        return;

    var services = context.ValidationContext.RequestServices!;

    // Get the current HttpContext
    var httpContext = services.GetRequiredService<IHttpContextAccessor>().HttpContext!;

    // Get the authenticated user, or use the IP address if unauthenticated
    var user = context.User;
    string key;
    if (user?.Identity?.IsAuthenticated == true)
    {
        // For authenticated users, use the user ID
        key = "name:" + user.Identity.Name;
    }
    else
    {
        // For unauthenticated users, use the IP address
        key = "ip:" + httpContext.Connection.RemoteIpAddress.ToString();
    }

    // Pull your throttling service (e.g. Polly) from the DI container
    var throttlingService = services.GetRequiredService<IThrottlingService>();

    // Throttle the request based on the complexity, subtracting the complexity from the user's limit
    var (allow, remaining) = await throttlingService.ThrottleAsync(key, context.TotalComplexity);

    // Add a header indicating the remaining throttling limit
    httpContext.Response.Headers["X-RateLimit-Remaining"] = remaining.ToString();

    // Report an error if the user has exceeded their limit
    if (!allow)
    {
        context.Error = new ValidationError($"Query complexity of {context.TotalComplexity} exceeded throttling limit. Remaining: {remaining}");
    }
};
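
IThrottlingService in the example above is an application-defined abstraction rather than part of GraphQL.NET. As a minimal sketch under that assumption, a fixed-window in-memory limiter could look like the following; the limit, window, and names are illustrative.

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public interface IThrottlingService
{
    // Returns whether the request is allowed and how much budget remains for this key
    Task<(bool Allow, double Remaining)> ThrottleAsync(string key, double cost);
}

public class InMemoryThrottlingService : IThrottlingService
{
    private const double LimitPerWindow = 5000;                        // complexity points per window
    private static readonly TimeSpan Window = TimeSpan.FromMinutes(1);
    private readonly ConcurrentDictionary<string, (DateTime Start, double Used)> _usage = new();

    public Task<(bool Allow, double Remaining)> ThrottleAsync(string key, double cost)
    {
        var now = DateTime.UtcNow;
        var entry = _usage.AddOrUpdate(
            key,
            _ => (now, cost),                                          // first request for this key
            (_, current) => now - current.Start > Window
                ? (now, cost)                                          // window expired: start over
                : (current.Start, current.Used + cost));               // accumulate within the window
        var remaining = Math.Max(0, LimitPerWindow - entry.Used);
        return Task.FromResult((entry.Used <= LimitPerWindow, remaining));
    }
}

Register the service (and IHttpContextAccessor, which the delegate also relies on) during DI setup, for example:

services.AddHttpContextAccessor();
services.AddSingleton<IThrottlingService, InMemoryThrottlingService>();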

6. Throttling Users Based on Execution Time

While the complexity analyzer does not directly measure execution time, you can use ExecutionOptions.Timeout / WithTimeout to control the maximum execution time of a query. See the following documentation for more information:

https://graphql-dotnet.github.io/docs/migrations/migration8/#24-execution-timeout-support
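
For reference, assuming the GraphQL.NET 8 execution timeout described in the linked migration notes is configured through the execution options, a sketch might look like:

services.AddGraphQL(b => b
    .AddSchema<MySchema>()
    .ConfigureExecutionOptions(options => options.Timeout = TimeSpan.FromSeconds(5))); // abort execution after 5 seconds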

Advanced Configurations

Defining Custom Complexity Calculations

To set custom complexity calculations for specific fields, you can use the WithComplexityImpact overload that defines a calculation delegate as demonstrated in the following example:

Field<ListGraphType<ProductGraphType>>("products")
    .Argument<IntGraphType>("offset")
    .Argument<IntGraphType>("limit")
    .WithComplexityImpact(context =>
    {
        var fieldImpact = 1;
        var childImpactModifier = context.GetArgument<int>("limit", 20); // use 20 if unspecified
        return new(fieldImpact, childImpactModifier);
    });
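
With this configuration, a request such as products(limit: 50) { id } is charged a child multiplier of 50, while omitting the limit argument falls back to the assumed 20 items.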