fix: Issue #34139 - Explicitly disable callbacks in LLMToolSelectorMiddleware #34351


Open
Manju080 wants to merge 2 commits into langchain-ai:master from Manju080:fix-tool-selector-disable-streaming

Conversation

@Manju080 commented Dec 14, 2025 (edited)

Fixes #34139

When using `LLMToolSelectorMiddleware` with `stream_mode="messages"`, the internal tool-selection tokens (the JSON output) leak into the final user stream. This happens because the internal LLM call inherits the parent's callback manager, which streams every generated token, including those intended only for middleware logic.

This PR introduces a fix to prevent the leakage:

  1. Added a `disable_streaming` parameter to the middleware's `__init__` (defaulting to `True`) to give users control.
  2. Explicitly cleared callbacks: instead of relying on `model.bind(streaming=False)`, which is not respected by all model providers (e.g. ChatOllama still leaks callback events), this implementation passes `config={"callbacks": []}` to the internal `invoke` and `ainvoke` calls.

I verified this locally using a reproduction script with ChatOllama (llama3.2).
Before: output contained interleaved internal JSON tokens (`{"tools": ...}`).
After: output is clean and contains only the final response tokens.

@github-actions bot added the langchain (`langchain` package issues & PRs) label on Dec 14, 2025
@Manju080 changed the title from "Fix issue #34139: Explicitly disable callbacks in LLMToolSelectorMiddleware" to "fix: Issue #34139 - Explicitly disable callbacks in LLMToolSelectorMiddleware" on Dec 14, 2025
@github-actions bot added and removed the fix (For PRs that implement a fix) label on Dec 14, 2025
@Manju080 (Author)

@sydney-runkle could you please look into this PR and let me know about any changes or updates.


Reviewers

No reviews

Assignees

No one assigned

Labels

fix (For PRs that implement a fix), langchain (`langchain` package issues & PRs)

Projects

None yet

Milestone

No milestone

Development

Successfully merging this pull request may close these issues.

Option to disable streaming for LLMToolSelectorMiddleware internal LLM calls

1 participant: @Manju080
