add support for background responses #6854
Conversation
SergeyMenshykh commented Sep 26, 2025
@dotnet-policy-service agree company="Microsoft"
Pull Request Overview
This PR introduces support for background responses (long-running operations) in the AI chat framework. The feature enables clients to initiate operations that can be polled or resumed from interruption points using continuation tokens.
- Adds `ResumptionToken` base class and `BackgroundResponsesOptions` for continuation token support
- Updates `ChatOptions`, `ChatResponse`, and `ChatResponseUpdate` with background response properties
- Implements background response functionality in `OpenAIResponsesChatClient` with polling and stream resumption
Reviewed Changes
Copilot reviewed 23 out of 23 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| ResumptionToken.cs | Introduces base class for continuation tokens with byte serialization |
| BackgroundResponsesOptions.cs | Defines options for enabling background responses |
| ChatOptions.cs | Adds continuation token and background response options properties |
| ChatResponse.cs | Adds continuation token property for background response polling |
| ChatResponseUpdate.cs | Adds continuation token property for stream resumption |
| ChatClientExtensions.cs | Provides extension methods for background response operations |
| OpenAIResponsesContinuationToken.cs | Implements OpenAI-specific continuation token with JSON serialization |
| OpenAIResponsesChatClient.cs | Core implementation of background response support with polling and streaming |
| FunctionInvokingChatClient.cs | Updates to handle continuation tokens in function calling scenarios |
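The table above says `ResumptionToken.cs` introduces a base class for continuation tokens with byte serialization. A minimal sketch of what such a shape could look like follows; the member names (`ToBytes`, `FromBytes`) are assumptions based on the file description, not confirmed signatures from the PR.

```csharp
using System;

// Hypothetical sketch of a continuation-token base class that round-trips
// through bytes, so a token can be persisted (e.g. in a database) between
// polling calls and rehydrated later. Member names are assumptions.
public class ResumptionToken
{
    private readonly ReadOnlyMemory<byte> _bytes;

    protected ResumptionToken()
    {
    }

    protected ResumptionToken(ReadOnlyMemory<byte> bytes) => _bytes = bytes;

    // Serialize the token to bytes for storage or transmission.
    public virtual ReadOnlyMemory<byte> ToBytes() => _bytes;

    // Rehydrate a token from previously stored bytes.
    public static ResumptionToken FromBytes(ReadOnlyMemory<byte> bytes) =>
        new ResumptionToken(bytes);
}
```

Provider-specific tokens (such as the `OpenAIResponsesContinuationToken` in the table) would derive from this base and encode their own state, per the description "OpenAI-specific continuation token with JSON serialization".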
stephentoub commented Oct 10, 2025
@SergeyMenshykh, why do you keep merging in main? It keeps resetting CI.
Merged commit 1eb963e into dotnet:main
* add support for background responses

  Adds a model for supporting background responses (long-running operations), updates the chat completion model to use it, and integrates this functionality into OpenAIResponsesChatClient.

  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  Co-authored-by: Stephen Toub <stoub@microsoft.com>
  Co-authored-by: westey <164392973+westey-m@users.noreply.github.com>
This PR adds a model for supporting background responses (long-running operations), updates the chat completion model to use it, and integrates this functionality into `OpenAIResponsesChatClient`.

Background responses use a continuation token returned by the `GetResponseAsync` and `GetStreamingResponseAsync` methods via the `ChatResponse` and `ChatResponseUpdate` classes, respectively, when background responses are enabled and supported by the chat client.

The continuation token contains all the details necessary to poll for a background response using the non-streaming `GetResponseAsync` method. It also allows resuming a streamed background response with the `GetStreamingResponseAsync` method if the stream is interrupted. In both cases, the continuation token obtained from the initial chat result, or from the last streamed update received before the interruption, should be supplied as input to the follow-up call.

When a background response has completed, failed, or cannot proceed further (for example, when user input is required), the continuation token returned by either method will be null, signaling to consumers that processing is complete and there is nothing to poll or resume. The result returned by either method can then be used for further processing.
Non-streaming API:
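A sketch of the polling flow described above, assuming the property names implied by the file table (`ChatOptions.BackgroundResponsesOptions`, `ChatOptions.ContinuationToken`, `ChatResponse.ContinuationToken`); the `GetChatClient` helper is hypothetical, and exact member shapes may differ from the merged API.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

// Hypothetical helper that returns an IChatClient backed by
// OpenAIResponsesChatClient; how the client is built is out of scope here.
IChatClient client = GetChatClient();

var options = new ChatOptions
{
    // Opt in to background (long-running) responses.
    // BackgroundResponsesOptions is the options type named in the file table;
    // its members are not shown in this summary, so a default instance is assumed.
    BackgroundResponsesOptions = new BackgroundResponsesOptions()
};

ChatResponse response = await client.GetResponseAsync("Write a detailed report.", options);

// A non-null continuation token means the operation is still running;
// poll by passing the token back in on follow-up calls.
while (response.ContinuationToken is not null)
{
    await Task.Delay(TimeSpan.FromSeconds(2));
    response = await client.GetResponseAsync(
        [],
        new ChatOptions { ContinuationToken = response.ContinuationToken });
}

// Token is null: the response has completed, failed, or needs user input.
Console.WriteLine(response.Text);
```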
Streaming API:
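Stream resumption could look like the following sketch: remember the latest `ContinuationToken` from each `ChatResponseUpdate`, and if the stream is interrupted, start a new streaming call with that token. As above, member names are assumptions drawn from the file table, and `GetChatClient` is a hypothetical helper.

```csharp
using System;
using System.Net.Http;
using Microsoft.Extensions.AI;

IChatClient client = GetChatClient(); // hypothetical helper

var options = new ChatOptions
{
    BackgroundResponsesOptions = new BackgroundResponsesOptions()
};

ResumptionToken? lastToken = null;
try
{
    await foreach (ChatResponseUpdate update in
        client.GetStreamingResponseAsync("Write a detailed report.", options))
    {
        // Track the most recent token so we can resume after an interruption.
        lastToken = update.ContinuationToken;
        Console.Write(update.Text);
    }
}
catch (HttpRequestException) when (lastToken is not null)
{
    // The stream was interrupted; resume from the last received update
    // by supplying the continuation token on a fresh streaming call.
    await foreach (ChatResponseUpdate update in client.GetStreamingResponseAsync(
        [],
        new ChatOptions { ContinuationToken = lastToken }))
    {
        Console.Write(update.Text);
    }
}
```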
CC: @westey-m