
Add Streaming of Function Call Arguments to Chat Completions #999


Merged

Conversation

devtalker (Contributor)

Summary

This PR implements real-time streaming of function call arguments, as requested in #834. Previously, function call arguments were emitted only after the entire function call was complete, which made for a poor user experience when generating large parameter payloads.

Changes

  • Enhanced ChatCmplStreamHandler: added real-time streaming of function call arguments during generation
  • New streaming logic: function call arguments now stream incrementally as they are generated, similar to text content
  • Backward compatibility: maintains existing behavior for completed function calls
  • Comprehensive testing: added tests for both OpenAI and LiteLLM models
  • Example implementation: created demonstration code showing the new streaming capability

Closes #834
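The behavior described above can be sketched with a small accumulator over mocked Chat Completions chunks. The chunk shapes and function names below are simplified stand-ins for illustration, not the SDK's actual types:

```python
# Sketch: accumulate streamed tool-call argument deltas by call index,
# mirroring how Chat Completions chunks arrive during generation.
# Chunk dicts here are simplified stand-ins, not real SDK objects.

def accumulate_tool_calls(chunks):
    calls = {}  # call index -> {"name": str, "arguments": str}
    for chunk in chunks:
        for tc in chunk.get("tool_calls", []):
            entry = calls.setdefault(tc["index"], {"name": "", "arguments": ""})
            fn = tc.get("function", {})
            if fn.get("name"):
                entry["name"] = fn["name"]  # name arrives in the first chunk
            entry["arguments"] += fn.get("arguments", "")  # streamed incrementally
    return calls

# Simulated stream: name first, then argument fragments chunk by chunk.
chunks = [
    {"tool_calls": [{"index": 0, "function": {"name": "get_weather", "arguments": ""}}]},
    {"tool_calls": [{"index": 0, "function": {"arguments": '{"city": '}}]},
    {"tool_calls": [{"index": 0, "function": {"arguments": '"Tokyo"}'}}]},
]
result = accumulate_tool_calls(chunks)
```

With streaming enabled, a UI can render `entry["arguments"]` after every chunk instead of waiting for the final one.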

@devtalker marked this pull request as ready for review, July 3, 2025 11:01
@devtalker changed the title from "# Support Streaming of Function Call Arguments" to "Support Streaming of Function Call Arguments", Jul 3, 2025
@devtalker (Contributor, Author)

Hey @seratch @rm-openai! 👋

Would you mind taking a look at this PR? I've implemented real-time streaming for function call arguments (fixes #834). Basically, instead of waiting for the entire function call to complete, users now see the arguments being built up incrementally as the LLM generates them. This should make the experience much smoother for things like code generation and API building.

Let me know what you think! 🚀

@devtalker force-pushed the feature/function-call-args-streaming branch from 6687ab8 to 3ee4d63, July 7, 2025 07:14
@seratch (Member) left a comment


Thanks for your pull request. Left a few comments.

@seratch changed the title from "Support Streaming of Function Call Arguments" to "Add Streaming of Function Call Arguments to Chat Completions", Jul 10, 2025
@devtalker requested a review from seratch, July 10, 2025 12:22
@devtalker force-pushed the feature/function-call-args-streaming branch from fe8ba90 to 34b1754, July 11, 2025 02:23
@devtalker (Contributor, Author)

@rm-openai @seratch Hi there, just wanted to gently check in and see if you had a chance to review the PR.

@rm-openai (Collaborator) left a comment


Hey @devtalker, this looks generally pretty good! However, I did some digging, and it turns out this comment that I wrote is inaccurate in practice:

 # Because we don't know the name of the function until the end of the stream, we'll save everything ...

The function name is actually correct the very first time and doesn't change. AFAICT this applies to all LLM providers.

So you should be able to simplify the code a decent amount!
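Since the name is stable from the first chunk, the handler can emit it once up front and then forward argument deltas as they arrive, rather than buffering the whole call. A minimal sketch of that flow, with illustrative event names and simplified chunk dicts (not the SDK's actual types):

```python
# Sketch: emit the function name as soon as it appears (first chunk),
# then forward each argument fragment immediately. Event tuples and
# chunk shapes are illustrative, not the real SDK event types.

def stream_tool_call_events(chunks):
    """Yield ('name', ...) once, then ('args_delta', ...) per fragment."""
    name_emitted = False
    for chunk in chunks:
        for tc in chunk.get("tool_calls", []):
            fn = tc.get("function", {})
            if not name_emitted and fn.get("name"):
                yield ("name", fn["name"])  # stable after the first chunk
                name_emitted = True
            if fn.get("arguments"):
                yield ("args_delta", fn["arguments"])

chunks = [
    {"tool_calls": [{"index": 0, "function": {"name": "search", "arguments": '{"q"'}}]},
    {"tool_calls": [{"index": 0, "function": {"arguments": ': "llm"}'}}]},
]
events = list(stream_tool_call_events(chunks))
```

Because nothing is deferred to end-of-stream, consumers see the call taking shape in real time, which is the UX improvement this PR targets.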

@devtalker (Contributor, Author)

devtalker commented Jul 14, 2025 (edited)
@rm-openai Thanks for the feedback! I've simplified the function call name logic based on your observation that function names are correct from the first chunk. Please review the changes when you have a chance.

@devtalker (Contributor, Author)

Hey @rm-openai! 👋 Could you take a quick look? We need this feature ASAP for better UX 🚀

@rm-openai merged commit 00c87ea into openai:main, Jul 16, 2025
5 checks passed
Reviewers

@rm-openai approved these changes

@seratch awaiting requested review

Assignees
No one assigned
Labels
Projects
None yet
Milestone
No milestone
Development

Successfully merging this pull request may close these issues.

Support Streaming of Function Call Arguments
3 participants
@devtalker @seratch @rm-openai
