Update torch and other dependencies to make it work on 24.04 #15716
base: dev
Conversation
Force-pushed from 4ac789f to 60031e2
Force-pushed from 60031e2 to 8f8ea14
You forgot the AMD changes.
I tried this out with amdgpu on Arch Linux and it actually worked fine 👍
@HinaHyugaHime what do you mean by AMD changes? I would be glad to provide them.
Torch version
The CLIPTextModel_from_pretrained change should not be there. That code is there to prevent loading the CLIP model from the web, since its weights are already included in the checkpoint, and removing None disables that. As for the rest, I'm generally against updating versions without an explicit need for it... Does this all work on Windows with the recommended Python 3.10.6?
@janbernloehr I finally fixed Torch hell for macOS with this PR, so please do not modify the macOS-specific parts. Have you tried just changing python and torch in webui-user.sh? Something like:
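For instance, a minimal override in webui-user.sh, roughly like this (the exact pins are just an illustration):

# webui-user.sh: pick the interpreter and torch install command explicitly
python_cmd="python3.12"
export TORCH_COMMAND="pip install torch==2.3.0 torchvision==0.18.0"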
If that works, I have another suggestion.
@AUTOMATIC1111 If what I proposed above works, I would suggest a similar approach to the one we have for macOS. Something like:
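A rough sketch of what that could look like in webui.sh (some_other_check is just a placeholder, and the pins are only an example):

# some_other_check is a placeholder for whatever condition fits here
if [[ some_other_check ]]; then
    python_cmd="python3.12"
    export TORCH_COMMAND="pip install torch==2.3.0 torchvision==0.18.0"
fi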
I am not sure what some_other_check should be.
@janbernloehr I managed to run A1111 with 3.12 on my Mac with just a few minor changes. I only have Ubuntu 20.04.6 (on a server without GPU), so I can't test whether this works on 24.x with Nvidia. I only changed this:

webui-user.sh:
# python3 executable
- #python_cmd="python3"
+ python_cmd="python3.12"

# install command for torch
- #export TORCH_COMMAND="pip install torch==1.12.1+cu113 --extra-index-url https://download.pytorch.org/whl/cu113"
+ export TORCH_COMMAND="pip install torch==2.3.0 torchvision==0.18.0"

requirements_versions.txt:
- transformers==4.30.2
+ transformers==4.41.2

I haven't changed requirements.txt, since it is for colab users. That's all.

After I remove venv, I always get an error on the first run, but if I just rerun ./webui again it works 🤷🏻♂️. I also get one error every time (since there is something I haven't replaced yet), but otherwise it works fine as far as I can tell.

@AUTOMATIC1111 If you don't mind, I will reopen #13667 with steps to reproduce and a note that I noticed that problem only with 3.12 and not with 3.10.

@janbernloehr that is the reason I am not using that approach. The slow method works just fine, all tests passed, and basic generation works fine (I just did some basic tests).
@viking1304: thanks for your input; indeed these are very minimal changes! My intention with this PR was to fix all the weird things too, so that's why I ended up updating a lot more deps. But I see that this might cause some unforeseen problems for some users.
EDIT: judging by https://github.com/pytorch/pytorch/releases/tag/v2.3.0, PyTorch, which is the most important package, does not work properly with 3.12 yet. So it would be better to find another solution instead of trying to make a1111 run on Python 3.12. Can you please try this?
On the latest Manjaro you now need to install python311 (not to be confused with python3.11) from yay, because the system dropped Python 3.11 support. So this PR is useful not only for Ubuntu users.
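For reference, with an AUR helper that would be roughly (assuming the python311 AUR package mentioned above):

# install the AUR python311 package via yay
yay -S python311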
Thank you for the pull request. I was able to use it to get everything working and to reproduce a previous generation on my Ubuntu 24.04 install.
This PR changes too many unnecessary things and might compromise other systems. What I did here is enough and does not break anything. Torch only partially works on 3.12, so some things might not work. @AUTOMATIC1111 since 3.10 can now be installed from deadsnakes/ppa on Ubuntu 24, I would suggest closing this PR and putting a note that users can install 3.10 like this:
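A sketch of the steps (package names assumed; python3.10-venv is included so the venv creation step works):

# add the deadsnakes PPA and install Python 3.10 alongside the system Python
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.10 python3.10-venv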
@janbernloehr @faattori, can you confirm that you can install 3.10 like this?
So far I haven't encountered anything torch-related that does not work on python3.12, so I am going to keep experimenting and see when and what breaks. For now I have no reason to install python3.10.
So I'm running a fresh install of Linux Mint 22, based on Ubuntu 24.04. I couldn't remove python 3.12, so I installed python 3.10 alongside it and then changed line 47 in webui.sh to point to python3.10 rather than python3. A1111 is now installed and I've been able to generate an image. I'm gonna keep going and see if I encounter any other issues tomorrow.
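The change itself looks roughly like this (the exact line number varies between versions of webui.sh):

# webui.sh default interpreter
- python_cmd="python3"
+ python_cmd="python3.10"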
I would like to point out that PyTorch 2.4, which was released some 3 weeks ago, DOES fully support Python 3.12 now. 2.4 also supports CUDA 12.4 instead of 12.1, so that'd be a nice target. Side note: the upcoming 2.4.1 seems to support CUDA 12.5.
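For anyone who wants to try that target, an override along these lines should work (the torchvision pairing and the cu124 wheel index are assumptions to double-check against the PyTorch install matrix):

# example override for webui-user.sh
export TORCH_COMMAND="pip install torch==2.4.0 torchvision==0.19.0 --index-url https://download.pytorch.org/whl/cu124"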
With this PR I ran into an error related to pydantic. However, adjusting the pydantic version to 1.10.18 seems to fix that. So maybe this could be adjusted before getting merged? (Tested on Arch Linux with Python 3.12.5)
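To try that workaround in an existing install before it is merged, something like this from the webui directory should do (assuming the default venv location):

# activate the existing virtualenv and pin pydantic
source venv/bin/activate
pip install pydantic==1.10.18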
@@ -65,7 +65,7 @@ def create_model_and_transforms_without_pretrained(*args, pretrained=None, **kwargs):
     return self.create_model_and_transforms(*args, pretrained=None, **kwargs)

 def CLIPTextModel_from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs):
-    res = self.CLIPTextModel_from_pretrained(None, *model_args, config=pretrained_model_name_or_path, state_dict={}, **kwargs)
+    res = self.CLIPTextModel_from_pretrained(pretrained_model_name_or_path, *model_args, config=pretrained_model_name_or_path, state_dict={}, **kwargs)
config=pretrained_model_name_or_path can be removed since it has the same behavior. See https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_utils.py#L3611. This is the only place where it is used as a string/path, and in the next line it gets replaced anyway.
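In other words, the suggestion is that the new line could be simplified to roughly:

res = self.CLIPTextModel_from_pretrained(pretrained_model_name_or_path, *model_args, state_dict={}, **kwargs)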
httpx==0.24.1
pillow-avif-plugin==1.4.3
GitPython==3.1.43
Pillow==10.3.0
If PIL is upgraded to >= 10.0 then some things will have to be rewritten, for example the case sketched below. Similar issues can exist in other dependencies; this needs lots of testing.
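One illustrative case (an assumption about the kind of breakage meant, not necessarily the example the reviewer had in mind): Pillow 10 removed the long-deprecated Image.ANTIALIAS constant, so old resize calls need updating along these lines:

- image = image.resize(size, Image.ANTIALIAS)
+ image = image.resize(size, Image.Resampling.LANCZOS)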
Description
Ubuntu 24.04 comes with Python 3.12 and a newer rustc. This requires updating torch to 2.2+, which is available for py3.12, and transformers to 4.34+, since only then is tokenizers new enough to build with the provided rustc.
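In requirements terms, the constraints described above boil down to roughly the following (illustrative only, not the exact pins changed in this PR):

# minimum versions implied by Python 3.12 and the newer rustc
torch>=2.2
transformers>=4.34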
Screenshots/videos:
Checklist: