Getting 500 Internal Server Error / Exception in ASGI application #14
Comments
Thank you very much for providing such detailed information and description in your issue.
Thank you for your prompt response, @jstzwj!
Hey @jstzwj, I'm still getting this error on the latest commit from the dev branch:
- Client logs
- Client env config
- Olah logs
- Olah env
I noticed a difference between the logs above and mine: my request was sent to 127.0.0.1, while yours was sent to 172.16.50.12. This difference might have triggered a bug at
Thanks @jstzwj!
- Client
- nginx server (nginx server config is here)
- olah server

And when directly accessing
I don't know the reason for sure. It happens occasionally when I pull a very large repository.
Thank you @jstzwj, I'm looking forward to testing the new version. You can also just try the exact same model https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct-FP8 -- it is gated, but you can request access to it.
I just attempted to optimize and fix some bugs in the caching. The new commit resolved the
Hey @jstzwj, I figured out that I have been getting 403 errors due to something stale under my

I think you might want to make some improvements to olah so that it doesn't require a user to manually clear the

Thank you!

Update
When downloading files from Hugging Face's LFS, a 302 redirect occurs from huggingface.co to cdn-lfs.huggingface.co. The ./hf_mirrors/heads directory stores records of HTTP redirects for the mirror site. The redirect location generated by older versions of Olah was incorrect, resulting in HTTP 403 errors.
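The failure mode described above can be sketched in a few lines. This is a hypothetical illustration, not Olah's actual code: the class name `RedirectCache` and the TTL-based expiry are assumptions made for the example. The point is that LFS redirect targets on cdn-lfs.huggingface.co are signed URLs, so a mirror that replays a stale or malformed cached `Location` header gets HTTP 403 from the CDN instead of the file.

```python
import time

class RedirectCache:
    """Illustrative cache of HTTP redirect Location headers (not Olah's API)."""

    def __init__(self, ttl_seconds=300.0):
        self.ttl = ttl_seconds
        self._cache = {}  # request path -> (location, stored_at)

    def put(self, path, location):
        self._cache[path] = (location, time.time())

    def get(self, path):
        entry = self._cache.get(path)
        if entry is None:
            return None
        location, stored_at = entry
        if time.time() - stored_at > self.ttl:
            # The signed CDN URL has likely expired: replaying it would
            # return 403, so drop it and force a fresh upstream request.
            del self._cache[path]
            return None
        return location
```

A mirror using a scheme like this would re-resolve the redirect upstream whenever the cached entry is missing or expired, rather than serving the stale signed URL to the client.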
It would be great if
Hey @jstzwj, we are constantly getting a

The vllm command is
I realized that I have not implemented the method "/api/{repo_type}/{org_repo}/tree/{commit}", which led to some issues. However, I am unsure why vllm requests /meta-llama/Meta-Llama-3.1-405B-Instruct-FP8/resolve/main/added_tokens.json; I don't think such a file exists in this repository.
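For context, the missing route would proxy the Hugging Face tree API upstream. The helper below is a minimal sketch, assuming the mirror simply forwards to huggingface.co; the function name `upstream_tree_url` is invented for illustration and is not part of Olah.

```python
# Build the upstream Hugging Face API URL that a mirror could proxy for
# the route pattern "/api/{repo_type}/{org_repo}/tree/{commit}".
# (Hypothetical helper; not Olah's actual implementation.)

UPSTREAM = "https://huggingface.co"

def upstream_tree_url(repo_type: str, org_repo: str, commit: str) -> str:
    # repo_type is "models", "datasets", or "spaces" in the HF Hub API.
    return f"{UPSTREAM}/api/{repo_type}/{org_repo}/tree/{commit}"
```

A mirror would fetch this URL, cache the JSON file listing, and serve it to clients such as vllm that enumerate repository contents before downloading.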
I just pushed a new commit to address this issue, which includes changes to the caching format. Therefore, when using the latest commit (5b74eba), please delete the
Great job @jstzwj! Thank you for your prompt responses and lightning-fast development! 🦾
Here is a small write-up of how I built it and use it: