From 94e2150f00dc34900973a7eb723f7198c16f4f6c Mon Sep 17 00:00:00 2001
From: Haijian Wang <130898843+Haijian06@users.noreply.github.com>
Date: Fri, 26 Jul 2024 23:11:04 +0800
Subject: [PATCH] Add files via upload

---
 .../Inference/Inference_using_lmdeploy.ipynb | 98 +++++++++++++++++++
 1 file changed, 98 insertions(+)
 create mode 100644 Cookbook/cn/opensource/Inference/Inference_using_lmdeploy.ipynb

diff --git a/Cookbook/cn/opensource/Inference/Inference_using_lmdeploy.ipynb b/Cookbook/cn/opensource/Inference/Inference_using_lmdeploy.ipynb
new file mode 100644
index 00000000..68c542d8
--- /dev/null
+++ b/Cookbook/cn/opensource/Inference/Inference_using_lmdeploy.ipynb
@@ -0,0 +1,98 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Inference with the Yi-1.5-6B-Chat Model Using lmdeploy\n",
+    "\n",
+    "Welcome to this tutorial! Here we walk you through running inference with the Yi-1.5-6B-Chat model using lmdeploy, a toolkit for compressing, deploying, and serving LLMs. Let's get started!"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## 🚀 Run on Colab\n",
+    "\n",
+    "We also provide a one-click [Colab notebook](https://colab.research.google.com/drive/1q3ROpne6ulkoybBzemeanHY6vNP9ykjV?usp=drive_link) to make development even easier!"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Installation\n",
+    "\n",
+    "First, install the required dependencies:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "metadata": {
+    "id": "installation"
+   },
+   "source": [
+    "!pip install lmdeploy"
+   ],
+   "execution_count": null,
+   "outputs": []
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Load the Model and Run Inference\n",
+    "\n",
+    "We will use the Yi-1.5-6B-Chat model for this demo. Its GPU memory and disk usage are listed below:\n",
+    "\n",
+    "| Model | GPU Memory | Disk Space |\n",
+    "|-------|------------|------------------|\n",
+    "| Yi-1.5-6B-Chat | 20.3G | 18G |\n",
+    "\n",
+    "Run the following command to start an interactive chat session:\n",
+    "\n",
+    "⚠️ Make sure your machine has enough GPU memory and disk space."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "metadata": {
+    "id": "inference"
+   },
+   "source": [
+    "!lmdeploy chat 01-ai/Yi-1.5-6B-Chat"
+   ],
+   "execution_count": null,
+   "outputs": []
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "That's it! You have successfully run inference with the Yi-1.5-6B-Chat model using lmdeploy. Try swapping in other models or adjusting the configuration to explore further. Happy experimenting!"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.8.8"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
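
Beyond the interactive `lmdeploy chat` CLI used in the notebook above, lmdeploy also exposes a Python `pipeline` API that is convenient for batched, programmatic inference. The snippet below is a minimal sketch rather than part of the patch; it assumes the same `01-ai/Yi-1.5-6B-Chat` checkpoint can be pulled from Hugging Face and that the default engine settings fit in available GPU memory, and the prompts and sampling values are illustrative only.

```python
# Minimal sketch of lmdeploy's Python pipeline API (assumptions: the
# 01-ai/Yi-1.5-6B-Chat weights download from Hugging Face, and the default
# engine configuration fits in GPU memory).
from lmdeploy import pipeline, GenerationConfig

# Build an inference pipeline for the same model used by `lmdeploy chat`.
pipe = pipeline("01-ai/Yi-1.5-6B-Chat")

# Sampling parameters; the values here are placeholders to adjust as needed.
gen_config = GenerationConfig(max_new_tokens=256, temperature=0.7, top_p=0.8)

# The pipeline accepts a list of prompts and returns one response per prompt.
prompts = [
    "Hi, please introduce yourself.",
    "Summarize what the Yi-1.5 chat models are designed for.",
]
responses = pipe(prompts, gen_config=gen_config)

for response in responses:
    print(response.text)
```

Compared with the CLI, this form makes it easier to script evaluations or serve many prompts at once, while the notebook's single command remains the quickest way to verify that the model loads and responds.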