MergeAnswersNode module understanding #712
-
I am trying to use SmartScraperMultiGraph and have a small doubt about how it operates. When the answers from all the URLs are merged by the MergeAnswersNode module, it sends all of them to the LLM at once. Could the combined size exceed the context window of the LLM? I cannot see any chunking or other context-reduction logic in the MergeAnswersNode module. Is my understanding correct?
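To make the concern concrete, here is a minimal sketch (not the actual ScrapeGraphAI code) of what "merge everything into one string and send it at once" looks like, with a rough size check against an assumed context window. The names `merge_answers`, `estimate_tokens`, and the `CONTEXT_WINDOW` value are illustrative assumptions, not part of the library's API.

```python
# Hypothetical sketch of the concern raised above: concatenate per-URL
# answers into one prompt string, then estimate whether it fits an
# assumed context window. Not the library's implementation.

CONTEXT_WINDOW = 8192        # assumed token limit of the target LLM
TOKENS_PER_CHAR = 0.25       # rough heuristic: ~4 characters per token


def merge_answers(answers):
    """Join per-URL answers into a single block, mimicking a naive merge."""
    return "\n\n".join(f"Answer {i + 1}: {a}" for i, a in enumerate(answers))


def estimate_tokens(text):
    """Very rough token estimate based on character count."""
    return int(len(text) * TOKENS_PER_CHAR)


answers = [f"Result scraped from URL {i}" for i in range(50)]
merged = merge_answers(answers)
print(estimate_tokens(merged), "estimated tokens,",
      "fits" if estimate_tokens(merged) <= CONTEXT_WINDOW else "exceeds",
      "the assumed window")
```

With 50 short answers this fits easily, but with long per-page extractions the merged string can grow without bound, which is exactly the scenario being asked about.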
Replies: 2 comments 3 replies
-
The chunking is done in the parse node, so it should not overflow the context window.
-
Thanks for the reply @VinciGit00. Suppose there are 50 URLs: MergeAnswersNode will combine the results from all of them and send them to the LLM at once. Is my understanding correct? If so, couldn't the combined answers exceed the LLM's context size?
-
Not all at once; it depends on your LLM's token limit. If the merged answers would exceed it, chunks are used.
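The fallback described here, splitting text into pieces that each fit a token budget, can be sketched as follows. `chunk_text` and its character-based token heuristic are illustrative assumptions, not the library's actual chunking code.

```python
# Hedged sketch of budget-based chunking: split a long merged string into
# pieces whose estimated token count stays under a limit. Illustrative
# helper only, not ScrapeGraphAI's implementation.

def chunk_text(text, max_tokens, tokens_per_char=0.25):
    """Split text into chunks estimated to fit within max_tokens each."""
    max_chars = int(max_tokens / tokens_per_char)
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


# Example: a 100-character string with a 10-token budget (~40 chars/chunk)
chunks = chunk_text("x" * 100, max_tokens=10)
print(len(chunks))  # 3 chunks: 40 + 40 + 20 characters
```

Real implementations typically split on sentence or paragraph boundaries rather than fixed character offsets, but the budget check is the same idea.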
-
Sorry if the question seems very basic, but I couldn't find this information in the code for the MergeAnswersNode module. It just merges all the answers into one string and sends that complete string to PromptTemplate. Is it a feature of LangChain's PromptTemplate to create chunks based on the LLM's context size?
-
Yes.