To Reproduce
Steps to reproduce the behavior:
1. Go to the Answer tab and use the default model, Microsoft Phi-3
2. Send a query to the model
Expected behavior
Model should load and reply
Screenshots
Log:
[ai-backend]: 2024-10-02 10:45:41,977 - INFO - Converting the current model to sym_int4 format......
[ai-backend]: C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\init.py:452: UserWarning: Initializing zero-element tensors is a no-op
warnings.warn("Initializing zero-element tensors is a no-op")
[ai-backend]: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[ai-backend]: Traceback (most recent call last):
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\service\llm_biz.py", line 69, in stream_chat_generate
model.generate(**args)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\lookup.py", line 92, in generate
return original_generate(self,
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\speculative.py", line 109, in generate
return original_generate(self,
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\pipeline_parallel.py", line 281, in generate
return original_generate(self,
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\transformers\generation\utils.py", line 1575, in generate
result = self._sample(
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\transformers\generation\utils.py", line 2697, in _sample
outputs = self(
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
[ai-backend]: File "C:\Users\Nicolas Oliver\.cache\huggingface\modules\transformers_modules\microsoft___Phi_3_mini_4k_instruct\modeling_phi3.py", line 1243, in forward
outputs = self.model(
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\models\phi3.py", line 282, in model_forward
return origin_model_forward(
File "C:\Users\Nicolas Oliver\.cache\huggingface\modules\transformers_modules\microsoft___Phi_3_mini_4k_instruct\modeling_phi3.py", line 1121, in forward
layer_outputs = decoder_layer(
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\Nicolas Oliver\.cache\huggingface\modules\transformers_modules\microsoft___Phi_3_mini_4k_instruct\modeling_phi3.py", line 839, in forward
hidden_states = self.input_layernorm(hidden_states)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\models\phi3.py", line 339, in phi3_rms_norm_forward
import xe_addons
ImportError: DLL load failed while importing xe_addons: The specified procedure could not be found.
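The failing import at the bottom of the traceback can be reproduced in isolation. The wrapper below is an illustrative sketch (it is not part of ipex-llm) showing how the native `xe_addons` import failure could be surfaced with an explicit hint about the suspected cause:

```python
# Illustrative sketch: wrap the failing native import so the suspected
# underlying cause (a torch version/ABI mismatch) is stated explicitly.
# xe_addons is the ipex-llm native extension that fails in the traceback
# above; this wrapper function itself is hypothetical.
def import_xe_addons():
    try:
        import xe_addons  # native module built against a specific torch build
        return xe_addons
    except ImportError as e:
        raise ImportError(
            "xe_addons failed to load; the installed torch build likely "
            "does not match the one ipex-llm was compiled against"
        ) from e
```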
Describe the bug
Trying to use the LLM functionality on the tip of the main branch fails.
The issue I am seeing in the log is similar to intel/ipex-llm#12143.
The maintainer's answer at intel/ipex-llm#12143 (comment) is that ipex-llm currently supports PyTorch 2.1.
Commit 25ef470#diff-8c59cb1e702fdc89c58380621c48e48ae5b7afc1b7ddde1c0b54d030d979f016R27 updated the PyTorch version to 2.3.
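Assuming the version-mismatch diagnosis above is right, a quick sanity check is to compare the installed torch version's major.minor series against the series ipex-llm supports. The helper below is a hypothetical illustration, not project code:

```python
def torch_series_supported(installed: str, supported_series: str = "2.1") -> bool:
    """Check whether a version string falls in the supported major.minor
    series, e.g. "2.3.0+cxx11.abi" -> "2.3". Per the maintainer's comment
    on intel/ipex-llm#12143, the supported series is assumed to be 2.1."""
    major_minor = ".".join(installed.split("+")[0].split(".")[:2])
    return major_minor == supported_series

print(torch_series_supported("2.1.2"))        # supported series
print(torch_series_supported("2.3.0+cxx11"))  # series after the version bump
```

A possible workaround, again assuming the diagnosis holds, would be to pin torch back to the 2.1 series (e.g. `pip install "torch==2.1.*"`) until ipex-llm gains PyTorch 2.3 support.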
Environment (please complete the following information):
Additional context
Add any other context about the problem here.