Unable to use llm.Chat (OpenAI) with only one existing embedder for OpenAI #372
A chat just defers to its underlying client to do an embedding: https://github.com/tmc/langchaingo/blob/51a3a0a0f54a/llms/openai/openaillm_chat.go#L127. There's a huge amount of duplication between the completion and chat implementations. There are multiple approaches to solving this without adding even more duplication; for example, we could add an interface for types that provide embeddings.
... another option is just to let each LLM type implement the embedding interface itself.
Yeah, that could resolve the issue. We should revert PR #362, then once we have a better approach to satisfy these needs, we can do the cleanup (and remove the duplication), because at the moment everyone who was using llm.Chat is blocked from picking up newer changes in the library. cc @tmc
I can help with this a bit later today; if we don't have a simple enough forward fix, I'm open to reverting for now.
PTAL at #374; this is a sketch of what I mean. Please try it and let me know if it works for you. This approach can help reduce a lot of the duplication that currently exists.
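To illustrate the interface-based approach discussed above, here is a minimal, self-contained sketch. The names (`EmbedderClient`, `CreateEmbedding`, `Embed`, the toy `LLM` and `Chat` types) are illustrative assumptions, not langchaingo's actual API: the point is that once both LLM types satisfy a shared embedding interface, a single embedder implementation serves them both.

```go
package main

import "fmt"

// EmbedderClient is a hypothetical interface in the spirit of the
// sketch in PR #374: any type that can create embeddings satisfies it.
type EmbedderClient interface {
	CreateEmbedding(texts []string) ([][]float32, error)
}

// LLM and Chat stand in for the two distinct LLM types
// (completion-style and chat-style) that wrap the same API client.
type LLM struct{}
type Chat struct{}

func (l *LLM) CreateEmbedding(texts []string) ([][]float32, error) {
	out := make([][]float32, len(texts))
	for i := range texts {
		// Toy embedding: one dimension holding the text length.
		out[i] = []float32{float32(len(texts[i]))}
	}
	return out, nil
}

// Chat defers to the same logic, mirroring how a chat LLM defers to
// its underlying client to do an embedding.
func (c *Chat) CreateEmbedding(texts []string) ([][]float32, error) {
	return (&LLM{}).CreateEmbedding(texts)
}

// Embed works for any EmbedderClient, so no per-type embedder
// implementation is needed.
func Embed(client EmbedderClient, texts []string) ([][]float32, error) {
	return client.CreateEmbedding(texts)
}

func main() {
	vecs, err := Embed(&Chat{}, []string{"hello"})
	if err != nil {
		panic(err)
	}
	fmt.Println(vecs)
}
```

With this shape, both the completion LLM and the chat LLM can be passed to the same embedder constructor, which is what removes the need for the duplicated `openaichat`-style packages.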
Merged! Can we consider this resolved?
I think we are good. Will resolve this issue. Thanks, both!
SGTM, I created #379 to track the follow-up cleanup and refactoring work.
Description
PR #362 (embeddings: remove unnecessary mostly-duplicated implementations) removed the openai and vertexai chat embedders. With that change, we can no longer pass an LLM created with the `WithClient` option to the embedder init method, because openai.LLM (https://github.com/tmc/langchaingo/blob/main/llms/openai/llm.go) and the openai.Chat LLM (https://github.com/tmc/langchaingo/blob/main/llms/openai/openaillm_chat.go) are two different LLMs. When we want to use the openai.Chat LLM, we need a proper embedder for it.
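The type mismatch described above can be sketched in a few lines of self-contained Go. The names here (`LLM`, `Chat`, `NewEmbedder`) are simplified stand-ins, not the real langchaingo signatures: the point is that an embedder constructor written against one concrete LLM type cannot accept the other, even though both wrap the same client.

```go
package main

import "fmt"

// LLM and Chat stand in for openai.LLM and the openai.Chat LLM:
// two distinct types wrapping the same underlying API client.
type LLM struct{ client string }
type Chat struct{ client string }

// NewEmbedder accepts only *LLM, so a *Chat cannot be passed in,
// even though it could create embeddings just as well.
func NewEmbedder(llm *LLM) string {
	return "embedder over " + llm.client
}

func main() {
	completion := &LLM{client: "openai-client"}
	chat := &Chat{client: "openai-client"}

	fmt.Println(NewEmbedder(completion)) // compiles and works
	// NewEmbedder(chat) // compile error: cannot use *Chat as *LLM
	_ = chat
}
```

This is why removing the dedicated chat embedders left chat-LLM users without any way to construct an embedder for their type.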
code example (chat LLM through Azure) with vectorstore:
The code above shows how we can use `llm.Chat` with its embedder. The problem is that now, without `"github.com/tmc/langchaingo/embeddings/openai/openaichat"` (or the equivalent package for vertexai), we cannot create the embedder needed by `llm.Chat`, nor pass it in via the `WithClient` LLM option.

@tmc @eliben, continuing the discussion from the merged PR above.
This is the main blocker for us/me and anyone else who wants to use the chat LLM.