
Development environment fails to use LLM: ipex_llm can't import xe_addons #71

Closed
dnoliver opened this issue Oct 2, 2024 · 3 comments
Labels: bug (Something isn't working)

dnoliver commented Oct 2, 2024

Describe the bug

Trying to use the LLM functionality on the tip of the main branch fails.
The issue I am seeing in the log is similar to this: intel/ipex-llm#12143
The maintainer's answer at intel/ipex-llm#12143 (comment) is that ipex-llm currently supports PyTorch 2.1.
Commit 25ef470#diff-8c59cb1e702fdc89c58380621c48e48ae5b7afc1b7ddde1c0b54d030d979f016R27 updated the PyTorch version to 2.3.
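A mismatch like this could be caught up front instead of surfacing as a late DLL import error deep inside generation. The sketch below is illustrative only: the helper names and the 2.1 pin are assumptions based on the maintainer's comment, not part of AI-Playground or ipex-llm.

```python
# Illustrative sketch: fail fast when the installed torch does not match
# the series ipex-llm supports (2.1.x, per the maintainer's comment).
# Helper names and the (2, 1) pin are assumptions, not AI-Playground code.
import importlib.metadata


def torch_matches(version: str, required: tuple[int, int] = (2, 1)) -> bool:
    """Return True if a torch version string is in the required major.minor series."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) == required


def require_torch(required: tuple[int, int] = (2, 1)) -> None:
    """Raise a clear error at startup instead of an ImportError mid-inference."""
    installed = importlib.metadata.version("torch")
    if not torch_matches(installed, required):
        raise RuntimeError(
            f"ipex-llm needs torch {required[0]}.{required[1]}.x, found {installed}"
        )


print(torch_matches("2.1.40"))  # True
print(torch_matches("2.3.0"))   # False
```

A check like this in `llm_biz.py` would have turned the opaque `xe_addons` DLL failure into an actionable version message.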

To Reproduce

Steps to reproduce the behavior:

  1. Deploy the development environment
  2. Go to the Answer tab and use the default model, Microsoft Phi-3
  3. Send a query to the model

Expected behavior

Model should load and reply

Screenshots

(screenshot omitted)

Log:

[ai-backend]: 2024-10-02 10:45:41,977 - INFO - Converting the current model to sym_int4 format......

[ai-backend]: C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\init.py:452: UserWarning: Initializing zero-element tensors is a no-op
  warnings.warn("Initializing zero-element tensors is a no-op")

[ai-backend]: Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.

[ai-backend]: Traceback (most recent call last):
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\service\llm_biz.py", line 69, in stream_chat_generate
    model.generate(**args)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\lookup.py", line 92, in generate
    return original_generate(self,
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\speculative.py", line 109, in generate
    return original_generate(self,
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\pipeline_parallel.py", line 281, in generate
    return original_generate(self,
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\transformers\generation\utils.py", line 1575, in generate
    result = self._sample(
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\transformers\generation\utils.py", line 2697, in _sample
    outputs = self(
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)

[ai-backend]:   File "C:\Users\Nicolas Oliver\.cache\huggingface\modules\transformers_modules\microsoft___Phi_3_mini_4k_instruct\modeling_phi3.py", line 1243, in forward
    outputs = self.model(
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\models\phi3.py", line 282, in model_forward
    return origin_model_forward(
  File "C:\Users\Nicolas Oliver\.cache\huggingface\modules\transformers_modules\microsoft___Phi_3_mini_4k_instruct\modeling_phi3.py", line 1121, in forward
    layer_outputs = decoder_layer(
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\.cache\huggingface\modules\transformers_modules\microsoft___Phi_3_mini_4k_instruct\modeling_phi3.py", line 839, in forward
    hidden_states = self.input_layernorm(hidden_states)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Users\Nicolas Oliver\Downloads\AI-Playground\env\lib\site-packages\ipex_llm\transformers\models\phi3.py", line 339, in phi3_rms_norm_forward
    import xe_addons
ImportError: DLL load failed while importing xe_addons: The specified procedure could not be found.

Environment (please complete the following information):

  • OS: Windows 11
  • GPU: [e.g. Intel Arc A770 16G]
  • CPU: Intel Core Ultra 7 155H
  • Version: tip of the main branch, commit d23a807


@dnoliver dnoliver added the bug Something isn't working label Oct 2, 2024

Nuullll commented Oct 8, 2024

Thanks for reporting this. Could you please try the dev branch? Commit 4000965 downgraded IPEX to 2.1.40 for MTL.


dnoliver commented Oct 8, 2024

Yeah, same as in #70.
Using the dev branch with the downgraded versions worked fine.


Nuullll commented Oct 9, 2024

Fixed by #68

@Nuullll Nuullll closed this as completed Oct 9, 2024