The Complexity Analyzer in GraphQL.NET is a powerful tool designed to manage the complexity and depth of GraphQL queries. It ensures that queries remain within acceptable bounds to prevent excessive load on the server. This documentation will guide you through the basic and advanced configuration of the complexity analyzer.
To enable the complexity analyzer and set limits on query depth and complexity, configure it during DI setup:

```csharp
services.AddGraphQL(b => b
    .AddSchema<MySchema>()
    .AddComplexityAnalyzer(c =>
    {
        c.MaxDepth = 10;
        c.MaxComplexity = 100;
    }));
```

The following configuration options are available:

| Option | Description | Default Value |
|---|---|---|
| MaxDepth | Limits the maximum depth of a query. | null |
| MaxComplexity | Limits the total complexity of a query. | null |
| DefaultScalarImpact | Specifies the default complexity impact for scalar fields. | 1 |
| DefaultObjectImpact | Specifies the default complexity impact for object fields. | 1 |
| DefaultListImpactMultiplier | Specifies the average number of items returned by list fields. | 20 |
| ValidateComplexityDelegate | Allows for custom validation and logging based on query complexity and depth. | null |
| DefaultComplexityImpactDelegate | Provides a default mechanism to calculate field impact and child impact multipliers. | see below |
The `DefaultComplexityImpactDelegate` is a built-in mechanism in GraphQL.NET that provides a default way to calculate the complexity impact of fields within a query. By default, this delegate assigns a complexity impact based on the type of the field being resolved. Scalar fields are given a default impact defined by `DefaultScalarImpact`, while object fields are assigned an impact defined by `DefaultObjectImpact`. For list fields, the delegate multiplies the impact by the `DefaultListImpactMultiplier`, unless a specific argument like `first`, `last`, or `id` is provided, which then adjusts the multiplier accordingly (set to 1 if the `id` argument is present). The delegate also considers connection semantics, ensuring that the impact is accurately reflected based on parent and child relationships within the query. This default behavior ensures a logical and consistent calculation of query complexity, making it easier to manage and limit query depth and execution cost.
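The default impact values listed in the table above can be adjusted when registering the analyzer. The following is a minimal sketch showing these properties set to their default values, assuming the same `MySchema` registration used earlier:

```csharp
services.AddGraphQL(b => b
    .AddSchema<MySchema>()
    .AddComplexityAnalyzer(c =>
    {
        c.DefaultScalarImpact = 1;          // impact applied to each scalar field
        c.DefaultObjectImpact = 1;          // impact applied to each object field
        c.DefaultListImpactMultiplier = 20; // assumed average number of items returned by list fields
    }));
```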
The below sample assumes that the complexity analyzer is configured with the default values.
```graphql
query {                # impact  multiplier  total impact  child multiplier  depth
  users(first: 10) {   # 1       1           1             10                1
    id                 # 1       10          11                              2
    posts {            # 1       10          21            20                2
      id               # 1       200         221                             3
      comments {       # 1       200         421           20                3
        id             # 1       4000        4421                            4
      }
    }
  }
  products(id: "5") {  # 1       1           4422          1                 1
    id                 # 1       1           4423                            2
    name               # 1       1           4424                            2
    photos {           # 1       1           4425          20                2
      id               # 1       20          4445                            3
      name             # 1       20          4465                            3
    }
    category {         # 1       1           4466          1                 2
      id               # 1       1           4467                            3
      name             # 1       1           4468                            3
    }
  }
}
```

The above query will have the following complexity calculation:

- Total complexity: 4468
- Maximum depth: 4
These values are calculated based on the following facts demonstrated in the above query:

- The `users` field requested 10 items, so the child multiplier is set to 10.
- The `posts` field is a list field and uses the default child multiplier of 20.
- The `comments` field is a list field and uses the default child multiplier of 20.
- The `products` field has an `id` argument, so the child multiplier is set to 1.
- The `photos` field is a list field and uses the default child multiplier of 20.
- The `category` field is not a list field and so does not use the default child multiplier.

To configure the complexity analyzer to estimate the total number of nodes returned and/or the maximum depth, you can use the default configuration, customize the default impact multiplier, or customize the impact multiplier used for specific fields. The default configuration assumes that list fields return an average of 20 items. For example:
```csharp
// Code-first
usersField.WithComplexityImpact(fieldImpact: 1, childImpactMultiplier: 100); // Assume the users field returns 100 items on average

// Schema-first / type-first:
[Complexity(fieldImpact: 1, childImpactMultiplier: 100)]
public static IEnumerable<User> Users([FromServices] IUserService userService) => userService.GetUsers();

// Assume that other list fields return 7 items on average
complexityConfig.DefaultListImpactMultiplier = 7;
```

With this configuration, the earlier query is calculated as follows:

```graphql
query {            # impact  multiplier  total impact  child multiplier  depth
  users {          # 1       1           1             100               1
    id             # 1       100         101                             2
    posts {        # 1       100         201           7                 2
      id           # 1       700         901                             3
      comments {   # 1       700         1601          7                 3
        id         # 1       4900        6501                            4
      }
    }
  }
}
```

The above query will have the following complexity calculation:

- Total complexity: 6501
- Maximum depth: 4
Since the number of rows returned from list fields can vary, it is recommended to use connection fields and to require the `first` or `last` argument so that the complexity analyzer can properly estimate the child multiplier for list fields (or have the default page size set very small). You can also choose to set the scalar and object impact to zero if you prefer to only consider the number of nodes and maximum depth, similar to the GitHub GraphQL API rate limits.
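A node-counting setup along those lines might look like the following sketch, assuming a `complexityConfig` instance as used in the other configuration examples in this document:

```csharp
// Ignore the per-field cost of scalar and object fields so that validation is
// driven by the estimated node counts and the maximum depth limits
complexityConfig.DefaultScalarImpact = 0;
complexityConfig.DefaultObjectImpact = 0;
```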
To prevent introspection requests from affecting the complexity calculation, you can configure the introspection fields' impact and child multiplier. An extension method is provided to simplify this configuration as shown below:
```csharp
// Code-first:
schema.WithIntrospectionComplexityImpact(0);   // Ignore introspection fields
// or
schema.WithIntrospectionComplexityImpact(0.1); // Reduce impact to 10%

// During DI setup:
services.AddGraphQL(b => b
    .ConfigureSchema(schema => schema.WithIntrospectionComplexityImpact(0)));
```

The above method sets the complexity impact and child multiplier for the three meta-fields to the provided value, effectively ignoring or reducing the impact of introspection requests on the complexity calculation.
For example, with the impact set to `0` as shown above, an introspection query such as the following adds nothing to the total complexity:

```graphql
{
  __schema {
    types {
      name
      fields {
        name
      }
    }
  }
}
```
Please note that the maximum depth calculation will still include introspection fields.
To ignore introspection fields from the maximum depth calculation, you can write a custom complexity validation delegate to ignore depth limits for introspection requests:
```csharp
complexityConfig.ValidateComplexityDelegate = async (context) =>
{
    if (IsIntrospectionRequest(context.ValidationContext))
    {
        context.Error = null; // ignore complexity errors
    }

    static bool IsIntrospectionRequest(ValidationContext validationContext)
    {
        return validationContext.Document.Definitions
            .OfType<GraphQLOperationDefinition>()
            .All(op => op.Operation == OperationType.Query &&
                op.SelectionSet.Selections.All(node =>
                    node is GraphQLField field &&
                    (field.Name.Value == "__schema" || field.Name.Value == "__type")));
    }
};
```

Another use case for the complexity analyzer is to estimate the computing power required to process a query. You can configure the impact for object fields to estimate the database processing time by setting a custom default object impact or configuring the impact for specific fields. The below examples assume that the scalar impact is 1, but you may wish to adjust this to zero if scalar fields do not require consequential processing time.
```csharp
// Set higher impact for field resolvers that require more processing time
// Code-first
usersField.WithComplexityImpact(fieldImpact: 50);

// Schema-first / type-first:
[Complexity(fieldImpact: 50)]
public static IEnumerable<User> Users([FromServices] IUserService userService) => userService.GetUsers();

// Set default for object fields (assumed to need to load from a database)
complexityConfig.DefaultObjectImpact = 20;
```

```graphql
query {            # impact  multiplier  total impact  child multiplier  depth
  users {          # 50      1           50            20                1
    id             # 1       20          70                              2
    posts {        # 20      20          470           20                2
      id           # 1       400         870                             3
      comments {   # 20      400         8870          20                3
        id         # 1       8000        16870                           4
      }
    }
  }
}
```

The above query will have the following complexity calculation:

- Total complexity: 16870
- Maximum depth: 4
In addition to validation, the `ValidateComplexityDelegate` property allows you to log complexity results for monitoring or analysis.
```csharp
complexityConfig.ValidateComplexityDelegate = async (context) =>
{
    // RequestServices may be used to access scoped services within the DI container
    var logger = context.ValidationContext.RequestServices!.GetRequiredService<ILogger<MySchema>>();

    if (context.Error != null) // failed complexity limits
        logger.LogWarning($"Query Complexity: {context.TotalComplexity}, Depth: {context.MaxDepth}");
    else
        logger.LogInformation($"Query Complexity: {context.TotalComplexity}, Depth: {context.MaxDepth}");
};
```

To throttle users on a per-user basis similar to GitHub's GraphQL API limits, configure the complexity analyzer with a custom validation delegate. As noted above, `MaxComplexity` and `MaxDepth`, if set, are still enforced before this delegate runs.
```csharp
complexityConfig.ValidateComplexityDelegate = async (context) =>
{
    // Skip throttling if the query has already exceeded complexity limits
    if (context.Error != null)
        return;

    var services = context.ValidationContext.RequestServices!;

    // Get the current HttpContext
    var httpContext = services.GetRequiredService<IHttpContextAccessor>().HttpContext!;

    // Get the authenticated user, or use the IP address if unauthenticated
    var user = context.User;
    string key;
    if (user?.Identity?.IsAuthenticated == true)
    {
        // For authenticated users, use the user name
        key = "name:" + user.Identity.Name;
    }
    else
    {
        // For unauthenticated users, use the IP address
        key = "ip:" + httpContext.Connection.RemoteIpAddress.ToString();
    }

    // Pull your throttling service (e.g. Polly) from the DI container
    var throttlingService = services.GetRequiredService<IThrottlingService>();

    // Throttle the request based on the complexity, subtracting the complexity from the user's limit
    var (allow, remaining) = await throttlingService.ThrottleAsync(key, context.TotalComplexity);

    // Add a header indicating the remaining throttling limit
    httpContext.Response.Headers["X-RateLimit-Remaining"] = remaining.ToString();

    // Report an error if the user has exceeded their limit
    if (!allow)
    {
        context.Error = new ValidationError($"Query complexity of {context.TotalComplexity} exceeded throttling limit. Remaining: {remaining}");
    }
};
```

While the complexity analyzer does not directly measure execution time, you can use `ExecutionOptions.Timeout` / `WithTimeout` to control the maximum execution time of a query. See the following documentation for more information:
https://graphql-dotnet.github.io/docs/migrations/migration8/#24-execution-timeout-support
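As a rough sketch, a timeout could be configured during DI setup as shown below, assuming the `Timeout` property described in the migration notes accepts a `TimeSpan`; the 30-second value is only illustrative:

```csharp
services.AddGraphQL(b => b
    .AddSchema<MySchema>()
    .ConfigureExecutionOptions(options =>
    {
        // Abort query execution after 30 seconds (illustrative value)
        options.Timeout = TimeSpan.FromSeconds(30);
    }));
```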
To set custom complexity calculations for specific fields, you can use the `WithComplexityImpact` overload that defines a calculation delegate, as demonstrated in the following example:
```csharp
Field<ListGraphType<ProductGraphType>>("products")
    .Argument<IntGraphType>("offset")
    .Argument<IntGraphType>("limit")
    .WithComplexityImpact(context =>
    {
        var fieldImpact = 1;
        var childImpactModifier = context.GetArgument<int>("limit", 20); // use 20 if unspecified
        return new(fieldImpact, childImpactModifier);
    });
```