Commit: add doc

ZingLix committed Feb 1, 2024
1 parent 3f6c08b commit 96c61ce
Showing 3 changed files with 48 additions and 21 deletions.
32 changes: 32 additions & 0 deletions docs/cli.md
@@ -32,7 +32,9 @@ $ qianfan [OPTIONS] COMMAND [ARGS]...
* `chat` Chat
* `completion` Completion
* `txt2img` Text-to-image
* `plugin` Plugin
* `dataset` Dataset
* `evaluation` Evaluation
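
As a quick illustration (each subcommand documented in this file lists a `--help` option), the options of a specific subcommand can be inspected with:

```console
$ qianfan chat --help
```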

### chat

@@ -53,6 +55,12 @@ $ qianfan chat [OPTIONS]
* `--debug`: Debug mode; prints the raw information of the related requests.
* `--help`: Show the help message

During a conversation, you can type the following commands (see the example after this list):

* `/reset`: Reset the conversation and clear the chat history
* `/exit`: End the conversation
* `/help`: Show the help message
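
Purely as an illustrative sketch (model replies are omitted and the prompt symbol is not necessarily the tool's actual prompt), a session using these commands might look like:

```console
$ qianfan chat
> Hello!      # a normal message is sent to the model
> /reset      # clear the chat history
> /exit       # end the conversation; the client prints "Bye!"
```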

### completion

![completion](./imgs/cli/completion.gif)
@@ -104,6 +112,30 @@ $ qianfan txt2img [OPTIONS] PROMPT
* `--debug`: Debug mode; prints the raw information of the related requests.
* `--help`: Show the help message
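
A minimal illustrative invocation; the prompt text below is only an example:

```console
$ qianfan txt2img "A cat sitting on a windowsill at sunset"
```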

### plugin

**Usage**:

```console
$ qianfan plugin [OPTIONS]
```

**Options**:

* `--endpoint TEXT`: Endpoint of the Qianfan plugin [required]
* `--multi-line / --no-multi-line`: Multi-line mode; press Esc and then Enter to submit, so that submitting does not conflict with inserting line breaks [default: no-multi-line]
* `--plugins`: List of plugins to enable, separated by `,`, e.g. `uuid-zhishiku,uuid-chatocr,uuid-weatherforecast`
* `--debug`: Debug mode; prints the raw information of the related requests
* `--bos-path`: BOS path used for uploading files
* `--help`: Show the help message
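
For illustration, a plugin session might be started as follows; the endpoint and BOS path are placeholders to replace with your own values, and the plugin IDs reuse the example list above:

```console
$ qianfan plugin --endpoint <your-plugin-endpoint> \
    --plugins uuid-zhishiku,uuid-weatherforecast \
    --bos-path <your-bos-path>
```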

During a conversation, you can type the following commands (see the example after this list):

* `/image [file_path]`: Upload an image and attach it to the conversation. `file_path` can be either a URL or a local file path; local files are uploaded to the BOS path, so the `bos-path` option must be provided.
* `/reset`: Reset the conversation and clear the chat history
* `/exit`: End the conversation
* `/help`: Show the help message
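
Again only as a sketch (model output is omitted, the file paths are hypothetical, and the prompt symbol is illustrative), attaching an image inside a plugin session could look like:

```console
> /image https://example.com/car.jpg   # attach an image from a URL
> /image ./photos/car.jpg              # local file, uploaded to the given --bos-path
> /exit
```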

### dataset

![](./imgs/cli/dataset.webp)
15 changes: 13 additions & 2 deletions src/qianfan/common/client/chat.py
@@ -51,7 +51,7 @@ class ChatClient(object):
END_PROMPT = "/exit"
RESET_PROMPT = "/reset"
HELP_PROMPT = "/help"
command_list = [END_PROMPT, RESET_PROMPT]
command_list = [END_PROMPT, RESET_PROMPT, HELP_PROMPT]
input_completer = WordCompleter(command_list, sentence=True)

def __init__(
@@ -186,6 +186,14 @@ def print_hint_msg(self) -> None:
" '--multi-line' option."
)

def print_help_message(self) -> None:
"""
Print command introduction
"""
rprint(f"[bold green]{self.END_PROMPT}[/]: End the conversation")
rprint(f"[bold green]{self.RESET_PROMPT}[/]: Reset the conversation")
rprint(f"[bold green]{self.HELP_PROMPT}[/]: Print this message")

def chat_in_terminal(self) -> None:
"""
Chat in terminal
@@ -213,10 +221,13 @@ def chat_in_terminal(self) -> None:
if message == self.END_PROMPT:
rprint("Bye!")
raise typer.Exit()
if message == self.RESET_PROMPT:
elif message == self.RESET_PROMPT:
self.msg_history = [QfMessages() for _ in range(len(self.clients))]
rprint("Chat history has been cleared.")
continue
elif message == self.HELP_PROMPT:
self.print_help_message()
continue

for i in range(len(self.clients)):
msg_history = self.msg_history[i]
22 changes: 3 additions & 19 deletions src/qianfan/common/client/plugin.py
@@ -157,8 +157,8 @@ def print_help_message(self) -> None:
rprint(f"[bold green]{self.END_PROMPT}[/]: End the conversation")
rprint(f"[bold green]{self.RESET_PROMPT}[/]: Reset the conversation")
rprint(
f"[bold green]{self.IMAGE_PROMPT}[/]: Attach a local image to the"
" conversation [dim](e.g. /image car.jpg)[/]"
f"[bold green]{self.IMAGE_PROMPT} <file_path>[/]: Attach a local image to"
" the conversation [dim](e.g. /image car.jpg)[/]"
)
rprint(f"[bold green]{self.HELP_PROMPT}[/]: Print this message")

@@ -270,13 +270,6 @@ def chat_in_terminal(self) -> None:

@credential_required
def plugin_entry(
model: Optional[str] = typer.Option(
None,
help=(
"Model name of plugin. EBPluginV2 will be used if not specified. Currently"
" not available but will be supported in the future."
),
),
endpoint: Optional[str] = typer.Option(
...,
help="Endpoint of the plugin.",
@@ -332,19 +325,12 @@ def plugin_entry(
help="Stop words. Use comma to split multiple stop words.",
rich_help_panel=MODEL_ARGUMENTS_PANEL,
),
disable_search: Optional[bool] = typer.Option(
None, help="Disable search", rich_help_panel=MODEL_ARGUMENTS_PANEL
),
enable_citation: Optional[bool] = typer.Option(
None, help="Enable citation", rich_help_panel=MODEL_ARGUMENTS_PANEL
),
) -> None:
"""
Chat with the LLM with plugins in the terminal.
"""
qianfan.disable_log()
if model is None and endpoint is None:
model = DefaultLLMModel.ChatCompletion
model = None

extra_args = {}

@@ -356,8 +342,6 @@ def add_if_not_none(key: str, value: Any) -> None:
add_if_not_none("top_p", top_p)
add_if_not_none("penalty_score", penalty_score)
add_if_not_none("system", system)
add_if_not_none("disable_search", disable_search)
add_if_not_none("enable_citation", enable_citation)

if stop is not None:
extra_args["stop"] = stop.split(",")
