Thank you for your amazing work. The problem is that I don't have A6000 GPUs at hand, so I changed the input size to 320x320 to fit my GPU memory. However, the results are terrible. I would like to know whether there are specific settings in the code that need to be changed; the method itself does not seem to depend on the input size. I have already modified the `int(512*y1):int(512*y2), int(512*x1):int(512*x2)` indices in plms.py. Do I also need to modify the Stable Diffusion config `v1-inference.yaml`? I also tried changing the learning rate in plms.py, but nothing works. Could you please tell me what I need to pay attention to?
Thank you very much.
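For anyone else changing the resolution: a common pitfall is that hard-coded `512` factors in the crop indices must all be replaced with the new image size, and any indices applied in latent space must additionally be divided by the VAE downsampling factor (8 for Stable Diffusion). A minimal sketch of that conversion (function and variable names here are illustrative, not from the repo):

```python
# Hypothetical helper: convert a normalized bounding box (coordinates in 0..1)
# into integer slice bounds in both pixel space and latent space.
# Assumes Stable Diffusion's 8x spatial downsampling; `image_size` replaces
# the hard-coded 512 from the original plms.py indices.
def bbox_to_slices(x1, y1, x2, y2, image_size=320, latent_downsample=8):
    # Pixel-space bounds, analogous to int(512*y1):int(512*y2), etc.
    pixel = (int(image_size * y1), int(image_size * y2),
             int(image_size * x1), int(image_size * x2))
    # Latent-space bounds: the latent grid is image_size // 8 on each side.
    latent_size = image_size // latent_downsample
    latent = (int(latent_size * y1), int(latent_size * y2),
              int(latent_size * x1), int(latent_size * x2))
    return pixel, latent
```

For a 320x320 input and a centered box `(0.25, 0.25, 0.75, 0.75)`, this yields pixel bounds `(80, 240, 80, 240)` and latent bounds `(10, 30, 10, 30)`; if the latent bounds are still computed from 512 (i.e. a 64x64 grid) the mask lands in the wrong place.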
Thank you for your interest in our work, and sorry for the late reply. We have updated the code; it should now be more efficient, use less memory (though still ~40 GB), and have a clearer structure. I believe plms.py should be the only script that needs to be changed. However, it is also a good idea to check the mask positions when changing the resolution. For this purpose, we implemented a `plot` function in ldm/modules/attention.py, which lets you inspect the intermediate masks and verify that they are at the correct positions after reshaping.