diff --git a/.changeset/moody-eggs-destroy.md b/.changeset/moody-eggs-destroy.md
new file mode 100644
index 0000000000..e078bdd334
--- /dev/null
+++ b/.changeset/moody-eggs-destroy.md
@@ -0,0 +1,5 @@
+---
+"@llamaindex/doc": patch
+---
+
+docs: update chat engine docs
diff --git a/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx b/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx
index d794963376..f75b0fc158 100644
--- a/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx
+++ b/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx
@@ -12,9 +12,26 @@
 const chatEngine = new ContextChatEngine({ retriever });
 const response = await chatEngine.chat({ message: query });
 ```
+More concisely, you can create a chat engine by calling `index.asChatEngine()`, which returns a `ContextChatEngine` ready for chatting.
+
+```typescript
+const chatEngine = index.asChatEngine();
+```
+
+You can also pass options to the chat engine, such as the number of retrieved chunks (`similarityTopK`) and a system prompt:
+
+```typescript
+const chatEngine = index.asChatEngine({
+  similarityTopK: 5,
+  systemPrompt: "You are a helpful assistant.",
+});
+```
+
+
 The `chat` function also supports streaming, just add `stream: true` as an option:
 
 ```typescript
+const chatEngine = index.asChatEngine();
 const stream = await chatEngine.chat({ message: query, stream: true });
 for await (const chunk of stream) {
   process.stdout.write(chunk.response);