Add ROCm GPU support #66
We're refactoring a few things to more easily support different architectures, though I haven't worked with AMD cards yet. I'll try adding them to the ONNX providers, but it may need some community testing to get right.
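For anyone who wants to test, a minimal sketch of what that provider refactor could look like; the provider identifiers are real onnxruntime names, but the model filename is a placeholder, and AMD support requires the separate onnxruntime-rocm build rather than the default wheel:

```python
# Sketch only: pick the best available execution provider at runtime.
# "ROCMExecutionProvider" ships in the onnxruntime-rocm build, not the
# default onnxruntime wheel; "kokoro.onnx" is a placeholder filename.
import onnxruntime as ort

preferred = ["ROCMExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available]

session = ort.InferenceSession("kokoro.onnx", providers=providers)
print("Running with:", session.get_providers())
```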
Would ONNX work on the Intel Arc GPUs?
I've got it to work with ROCm. The problem is that the Docker image becomes huge: around 22 GB when I use https://github.com/bgs4free/Kokoro-FastAPI/tree/add-rocm-support
What version of ROCm did you get it working with? I have a 5700 XT, and that card only supports ROCm 5.2 unofficially. Any chance you could get it working on that? Or maybe get DirectML working with the Dockerfile.
I'm using the PyTorch index for ROCm 6.2, because I had issues with the 6.3 version (the minor version mismatch seems to be no issue, AFAIK). Change it to 5.2, try it, and see if it works. Make sure you have enough disk space.
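For reference, a sketch of the index swap being described; the URLs follow PyTorch's published wheel-index naming, but note that the ROCm 5.2 index only ever carried older torch releases:

```bash
# ROCm 6.2 wheels (what the fork currently uses):
pip install torch --index-url https://download.pytorch.org/whl/rocm6.2

# ROCm 5.2 wheels exist only for older torch releases, e.g.:
pip install torch==1.13.1 --index-url https://download.pytorch.org/whl/rocm5.2
```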
Okay, I tried that and it gave me an error. According to this thread https://stackoverflow.com/questions/73229163/amd-rocm-with-pytorch-on-navi10-rx-5700-rx-5700-xt you have to set the environment variable HSA_OVERRIDE_GFX_VERSION=10.3.0. I just added the variable to the Dockerfile under env. Any idea what else I could try?
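The change described there amounts to a one-line addition; this is a sketch of a Dockerfile fragment, not the repo's actual file:

```dockerfile
# Sketch: spoof a supported GPU architecture so ROCm kernels load on a
# Navi10 (RX 5700 / 5700 XT) card, per the Stack Overflow thread above.
ENV HSA_OVERRIDE_GFX_VERSION=10.3.0
```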
Devices get passed through from the host. See: https://github.com/bgs4free/Kokoro-FastAPI/blob/add-rocm-support/docker/rocm/docker-compose.yml#L20 What is your host system? This was tested on Linux, and I can't speak for any other OS.
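The relevant passthrough looks roughly like this (a sketch following AMD's documented container setup, not a verbatim copy of the linked file; the service name is a placeholder):

```yaml
services:
  kokoro-rocm:            # placeholder service name
    devices:
      - /dev/kfd          # ROCm compute interface
      - /dev/dri          # GPU render nodes
    group_add:
      - video             # host group that owns the device nodes
```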
Oh, okay, it's Windows. I thought that wouldn't really matter since Docker runs an Ubuntu image.
https://rocm.docs.amd.com/projects/install-on-linux/en/latest/how-to/docker.html I don't see anything about Docker for Windows here.
I checked my Docker config; it is running through WSL, so it's technically Ubuntu and should work.
Taking a shot at this shortly; will post links if anyone is able to test it out (I don't have an AMD card, unfortunately, so I can just skeleton it up).
My fork works on my Linux machine. I didn't make a PR, though, because I personally didn't find it convincing enough to pursue. Please feel free to use my stuff if you see any value.
But for Arc I already implemented a version with IPEX (Intel Extension for PyTorch) support that uses nearly the same code as CUDA GPUs. For more on Arc, please use issue #106.
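A minimal sketch of that XPU path, assuming intel_extension_for_pytorch is installed and `model` is an already-loaded PyTorch module:

```python
# Sketch: importing IPEX registers the "xpu" device with PyTorch, so the
# device-selection code mirrors the CUDA path almost exactly.
import torch
import intel_extension_for_pytorch as ipex

device = "xpu" if torch.xpu.is_available() else "cpu"
model = model.to(device)          # `model` is a placeholder module
model = ipex.optimize(model)      # optional IPEX kernel optimizations
```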
Hey, it'd be really nice if we could run this model API on AMD cards; it would provide a very nice speedup over CPU. Do you think you could add this when you have some free time? It shouldn't be too hard, since AFAIK ONNX models can run with ROCm just fine.
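A quick way to verify that claim on a given machine (a sketch; it requires the onnxruntime-rocm package rather than the default onnxruntime wheel):

```python
import onnxruntime as ort

# "ROCMExecutionProvider" should appear here if the ROCm build is installed.
print(ort.get_available_providers())
```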