Installing xFormers and Triton #4027
Rogala started this conversation in Show and tell
I don't know whether Triton gives any advantage on its own, but seeing the value True when xFormers is installed warms my heart. xFormers speeds up generation; not significantly on my PC, but you may get different results.
The whole setup comes down to updating PyTorch to 2.7.0.
If you have the portable version, open a console (cmd) in the python_embedded folder, then update torch and install xformers.
First, run the nvidia-smi command to determine which CUDA version is installed on your PC (this is about Windows; under Linux I think there will be no problems). According to the description, we need CUDA 12.6 or 12.8.
If your CUDA version is not >=12.8, just change the cu128 suffix at the end of the link to cu126.
Console commands in order:
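The update and install commands for the portable version are probably along these lines (a sketch, not the author's exact commands; it assumes the cu128 wheel index from the step above, so swap in cu126 if that is your CUDA version):

```shell
REM Run from inside the python_embedded folder of the portable install.
REM Upgrade PyTorch to 2.7.0 using the CUDA 12.8 wheel index:
.\python.exe -m pip install --upgrade torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128

REM Install xFormers against the same CUDA build:
.\python.exe -m pip install -U xformers --index-url https://download.pytorch.org/whl/cu128
```

Using the same `--index-url` for both commands keeps torch and xformers built against the same CUDA version, which avoids the usual version-mismatch warnings.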
This is the check command for the installation; it will show you all the features that are available when working with xFormers on your system:
.\python.exe -m xformers.info
If you downloaded a version of Fooocus and are working with a virtual environment (.venv) for the Python libraries, the commands and actions look like this: open a console (cmd) in the folder where the virtual environment is located.
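For the virtual-environment case, the sequence is likely something like this (a sketch, assuming the environment folder is named .venv and the cu128 index from earlier; adjust to cu126 if needed):

```shell
REM Activate the virtual environment (Windows cmd):
.venv\Scripts\activate

REM Same upgrade and install as the portable case, but via the venv's python:
python -m pip install --upgrade torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128
python -m pip install -U xformers --index-url https://download.pytorch.org/whl/cu128

REM Verify the result:
python -m xformers.info
```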
I think this will mainly help on systems where 60 steps take more than 1 minute; for others it will speed things up by no more than 1-2 seconds.