[Performance 5/6] Prevent unnecessary extra networks bias backup #15816
Conversation
AG-w commented May 16, 2024
Is it possible to make this compatible with extensions that change LoRA weights during sampling? https://github.com/cheald/sd-webui-loractl
huchenlei commented May 16, 2024
Do you mean that this change will break dynamic LoRA weights?
AG-w commented May 17, 2024
The latest commit fixed it, thanks.
altoiddealer commented May 17, 2024
Whoa baby, I am SO looking forward to Forge speeds plus the ability to use the loractl extension. Absolutely amazing.
Description
According to lllyasviel/stable-diffusion-webui-forge#716 (comment), network_apply always incurs some overhead from copying tensors, even when no extra networks are enabled. This PR prevents that behaviour; the performance gain is about 25ms/it.
The unnecessary weight copy was previously prevented in https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/12599/files. This PR is a follow-up that applies the same approach to the bias backup.
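The idea can be illustrated with a minimal sketch (hypothetical names, not the actual webui code): only create a bias backup when an extra network actually modifies a module, and skip the restore/copy entirely when no backup exists, so the no-network path pays no per-iteration tensor copy.

```python
import torch


def apply_networks(module: torch.nn.Module, active_networks: list) -> None:
    """Apply extra-network deltas to `module`, backing up its bias only when needed.

    Sketch only: `bias_backup` and `apply_to` are illustrative names, not the
    real webui attributes or API.
    """
    bias_backup = getattr(module, "bias_backup", None)

    if not active_networks:
        # No extra networks enabled: restore once if a stale backup exists,
        # then drop it so subsequent iterations do no copying at all.
        if bias_backup is not None and module.bias is not None:
            module.bias.data.copy_(bias_backup)
            module.bias_backup = None
        return

    if bias_backup is None and module.bias is not None:
        # First time a network touches this module: keep one copy of the
        # original bias so it can be restored later.
        module.bias_backup = module.bias.data.clone()

    for network in active_networks:
        network.apply_to(module)  # hypothetical hook that adds the bias delta
```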
Screenshots/videos:
Checklist: