
AMD install does not detect Rocm and just errors out asking for CUDA #1058

Open
BrechtCorbeel opened this issue Dec 21, 2024 · 5 comments
Labels: bug (Something isn't working)

Comments

@BrechtCorbeel

What happened?

I moved my 4090 to my Linux server and put my 7900 XTX in my main machine, since the 4090 just works and runs better. Even with the AMD install of Comfy I could not get it to run on Linux, and I cannot get it to run on Windows either; I just get stuck with an error asking for CUDA:

Traceback (most recent call last):
  File "W:\smatrix\Data\Packages\ComfyUIAMD\main.py", line 132, in <module>
    import execution
  File "W:\smatrix\Data\Packages\ComfyUIAMD\execution.py", line 13, in <module>
    import nodes
  File "W:\smatrix\Data\Packages\ComfyUIAMD\nodes.py", line 22, in <module>
    import comfy.diffusers_load
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\sd.py", line 6, in <module>
    from comfy import model_management
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\model_management.py", line 145, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
  File "W:\smatrix\Data\Packages\ComfyUIAMD\comfy\model_management.py", line 114, in get_torch_device
    return torch.device(torch.cuda.current_device())
  File "W:\smatrix\Data\Packages\ComfyUIAMD\venv\lib\site-packages\torch\cuda\__init__.py", line 878, in current_device
    _lazy_init()
  File "W:\smatrix\Data\Packages\ComfyUIAMD\venv\lib\site-packages\torch\cuda\__init__.py", line 305, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
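
That assertion means the torch wheel inside the package's venv is a CPU-only (or CUDA-only) build rather than a ROCm one, so any call into torch.cuda fails before ComfyUI can start. A minimal diagnostic sketch (assuming only a stock PyTorch install, run from the same venv) to see which backend the wheel was actually built for:

```python
# Minimal diagnostic: which accelerator backend was this torch wheel built for?
# A ROCm wheel exposes the GPU through the torch.cuda API, but reports a HIP
# version instead of a CUDA version.
import torch

print("torch version:", torch.__version__)         # ROCm wheels look like "2.x.y+rocm6.z"
print("CUDA build:   ", torch.version.cuda)        # None on CPU-only and ROCm wheels
print("HIP build:    ", torch.version.hip)         # None on CPU-only and CUDA wheels
print("GPU available:", torch.cuda.is_available()) # True on ROCm builds too (HIP is mapped to the cuda API)

if torch.version.cuda is None and torch.version.hip is None:
    # A CPU-only wheel is exactly what raises
    # "Torch not compiled with CUDA enabled" in the traceback above.
    print("CPU-only wheel installed.")
```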

Steps to reproduce

No response

Relevant logs

No response

Version

v 2.14.4

What Operating System are you using?

Windows

@BrechtCorbeel BrechtCorbeel added the bug Something isn't working label Dec 21, 2024
@BrechtCorbeel BrechtCorbeel changed the title AMD install does not detect mROC and just errors out asking for CUDA AMD install does not detect Rocm and just errors out asking for CUDA Dec 21, 2024
@brknsoul

ROCm (AMD's version of CUDA) is not available for Windows.

@BrechtCorbeel
Author

> ROCm (AMD's version of CUDA) is not available for Windows.

That didn't work for me on Linux either, but ROCm is installable on Windows.

I have a new PC ordered, onto which I will install my AMD card and try again.

I had issues with Ubuntu 24; is there a reason, or something I should know, when picking an OS version?

@Sersys

Sersys commented Feb 7, 2025

I'm having the same issue with my RX 7900 XT on Linux. I couldn't get ComfyUI working on the previous PyTorch version because of the "Attempting to use hipBLASLt on a unsupported architecture!" bug. I was advised to update PyTorch, and now it's looking for Nvidia drivers. Ironically, the only solution I've found so far is to downgrade PyTorch, in which case I'm facing the hipBLASLt issue again.
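
For anyone hitting that hipBLASLt warning on a newer ROCm wheel, one commonly suggested workaround is to steer torch's BLAS layer back to rocBLAS instead of downgrading. A minimal sketch, under the assumption that the installed ROCm build of PyTorch honors the TORCH_BLAS_PREFER_HIPBLASLT environment variable:

```python
# Sketch of the commonly suggested hipBLASLt workaround (assumption: the
# installed ROCm build of PyTorch honors TORCH_BLAS_PREFER_HIPBLASLT).
# The variable must be set before torch is imported/initialized.
import os
os.environ.setdefault("TORCH_BLAS_PREFER_HIPBLASLT", "0")  # fall back to rocBLAS

import torch

# Quick smoke test: a GPU matmul exercises the BLAS path that otherwise
# triggers "Attempting to use hipBLASLt on a unsupported architecture!".
if torch.cuda.is_available():  # True on ROCm wheels too; HIP is mapped to the cuda API
    x = torch.randn(256, 256, device="cuda")
    print((x @ x).sum().item())
else:
    print("No ROCm/CUDA device visible to torch.")
```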

@brknsoul

brknsoul commented Feb 7, 2025

There's a ZLUDA fork of ComfyUI here: https://github.com/patientx/ComfyUI-Zluda

I've written a guide if you want to install it manually.

(EDIT: thanks @mohnjiles, I didn't know it was available on SM.)

@mohnjiles
Contributor

mohnjiles commented Feb 7, 2025

> There's a ZLUDA fork of ComfyUI here: https://github.com/patientx/ComfyUI-Zluda
>
> I've written a guide to install it; it's not installable via SM.

This has been available in SM since v2.13.0. If you aren't seeing it, try enabling the Show All Packages option or make sure you're on the latest version.

For the Linux users, we'll have some fixes in the next 2.13.4 for the ROCm stuff.
