Update pytorch lightning utilities #15310
Conversation
AUTOMATIC1111 commented Mar 18, 2024
What about an option where you assign to […]
Dalton-Murray commented Mar 18, 2024 • edited
I want to confirm what you mean by this, since I'm a little confused. This snippet is from ddpm; do you mean something like this?

```python
from torchvision.utils import make_grid
pytorch_lightning.utilities.distributed = pytorch_lightning.utilities.rank_zero
from pytorch_lightning.utilities.rank_zero import rank_zero_only
from omegaconf import ListConfig
```

If so, this doesn't work because […]

Another way I tried was this, but it doesn't seem to work either:

```python
import pytorch_lightning.utilities.rank_zero as rank_zero
import pytorch_lightning.utilities.distributed as distributed
distributed = rank_zero
distributed.rank_zero_only()
```

However, this still does require some initial modification of the […]. Perhaps I am thinking about this or doing this incorrectly. This is another way I thought of, but it doesn't work:

```python
import pytorch_lightning.utilities.rank_zero as rank_zero
def distributed():
    return rank_zero
from pytorch_lightning.utilities.distributed import rank_zero_only
```

Something like this would work, but it does change the code (and is technically unnecessary since it doesn't do anything with […]):

```python
import pytorch_lightning.utilities.rank_zero as distributed
from pytorch_lightning.utilities.rank_zero import rank_zero_only
```

If you prefer, I could instead replace this with a try/except like this, which does work:

```python
try:
    from pytorch_lightning.utilities.distributed import rank_zero_only
except ImportError:
    from pytorch_lightning.utilities.rank_zero import rank_zero_only
```

Sorry for the wall, this was just me trying to figure out a way of doing this.
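The try/except fallback at the end is the variant that works; here is a self-contained sketch of that pattern. Since pytorch_lightning may not be installed where this runs, the two module locations are simulated with stdlib `types.ModuleType` stand-ins under a made-up `fake_pl` name — the names and the trivial decorator are assumptions for illustration only:

```python
import sys
import types

# Stand-ins for the two pytorch_lightning module paths, so this sketch runs
# without pytorch_lightning installed; "fake_pl" is a made-up package name.
pkg = types.ModuleType("fake_pl")
utils = types.ModuleType("fake_pl.utilities")
rank_zero = types.ModuleType("fake_pl.utilities.rank_zero")
rank_zero.rank_zero_only = lambda fn: fn  # simplified stand-in decorator
sys.modules["fake_pl"] = pkg
sys.modules["fake_pl.utilities"] = utils
sys.modules["fake_pl.utilities.rank_zero"] = rank_zero
# fake_pl.utilities.distributed is deliberately absent, mirroring current
# pytorch_lightning versions where rank_zero_only has moved.

try:
    # Old location: succeeds only on old pytorch_lightning versions.
    from fake_pl.utilities.distributed import rank_zero_only
except ImportError:
    # New location: works on current versions.
    from fake_pl.utilities.rank_zero import rank_zero_only
```

Because the old submodule is missing, the `except ImportError` branch runs and `rank_zero_only` is bound from the new location.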
AUTOMATIC1111 commented Mar 19, 2024
Something like this: […]
Dalton-Murray commented Mar 19, 2024
That'd totally work! I just skipped over it initially because I wasn't sure if you wanted to allow importing sys or not, since it's not already being imported. Will adapt it to the current edits.
AUTOMATIC1111 commented Mar 19, 2024
What I'm suggesting is to do this once, somewhere in code that's executed early (i.e. a function in […]). You just gotta be careful and not replace […]
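That one-time early aliasing could be sketched as a small helper. The function name, and the idea of guarding with `importlib.util.find_spec` so a module that still exists on older library versions is never replaced, are assumptions, not code from this PR:

```python
import importlib
import importlib.util
import sys

def alias_if_missing(old_name: str, new_name: str) -> None:
    """If old_name is no longer importable, register new_name's module under
    old_name in sys.modules, so legacy `from old_name import ...` statements
    keep working. The find_spec guard ensures a module that really exists
    (on older library versions) is never shadowed."""
    try:
        missing = importlib.util.find_spec(old_name) is None
    except ModuleNotFoundError:
        missing = True  # a parent package is absent
    if missing:
        sys.modules[old_name] = importlib.import_module(new_name)
```

Called once early (e.g. from webui startup code), this would look something like `alias_if_missing("pytorch_lightning.utilities.distributed", "pytorch_lightning.utilities.rank_zero")`, after which the old `from pytorch_lightning.utilities.distributed import rank_zero_only` in ddpm keeps working untouched.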
Dalton-Murray commented Mar 21, 2024
That'd make more sense! I'll revert and work on implementing something like that.
Dalton-Murray commented Mar 21, 2024 • edited
Not sure if you want to keep the print statement in there or not, as it feels a bit unnecessary to me?
Forgot some things to revert
Fully reverts this time
Dalton-Murray commented Mar 21, 2024
Sorry, I forgot to revert a few lines back; should be good now.
Dalton-Murray commented Mar 21, 2024
Sorry, a bit of a mess; the import was actually definitely required, but I moved it further down into the if […]
Description
Changes
- `pytorch_lightning.utilities.distributed` to `pytorch_lightning.utilities.rank_zero`, as this is the new/proper place to get `rank_zero_only`, which does not exist in `distributed` anymore. This also (mostly) fixes the issue of it not being able to start up on a clean install for those who have this error. This change has existed for quite a few versions of pytorch lightning, and definitely exists in the current version (1.9.4).

Docs
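Concretely, the change amounts to updating the import path. A hedged sketch follows; the `ModuleNotFoundError` guard and identity stand-in are only there so the snippet runs even where pytorch_lightning is not installed, and are not part of the PR:

```python
# Old import path (older pytorch_lightning releases):
# from pytorch_lightning.utilities.distributed import rank_zero_only

# New import path used by this PR (current releases such as 1.9.4):
try:
    from pytorch_lightning.utilities.rank_zero import rank_zero_only
except ModuleNotFoundError:
    # pytorch_lightning may not be installed here; an identity stand-in
    # keeps the example self-contained.
    def rank_zero_only(fn):
        return fn
```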
This fixes many issues such as: #13785, #13464, #11642, #11501, #11458, and a few others.
Unfortunately, this doesn't fix every issue, as the same change still needs to happen in the repository Stability-AI/stablediffusion.
Checklist: