[Doc][Setup] Change python version and vllm version #103

Merged · 1 commit · Feb 13, 2025
docs/Quickstart.md (2 changes: 1 addition & 1 deletion)

@@ -2,7 +2,7 @@

## Requirements

-Llumnix requires python `3.8.1~3.10.0` and is currently built on top of vLLM (version 0.4.2). Therefore, the installation requirements are almost identical to those of vLLM. You can view the specific installation requirements for vLLM at the following link:
+Llumnix requires python `3.9~3.10` and is currently built on top of vLLM (version 0.6.3.post1). Therefore, the installation requirements are almost identical to those of vLLM. You can view the specific installation requirements for vLLM at the following link:

[vLLM Installation](https://docs.vllm.ai/en/v0.6.3.post1/getting_started/installation.html)

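Not part of the diff — a minimal sketch of what the tightened requirement means in practice: a pre-install check that the running interpreter falls inside the new `3.9~3.10` window documented above.

```python
# Illustrative only; not from this PR. Fails fast if the interpreter
# is outside the Python range Llumnix now documents (3.9~3.10).
import sys

if not ((3, 9) <= sys.version_info[:2] <= (3, 10)):
    raise SystemExit(
        f"Llumnix requires Python 3.9~3.10, found {sys.version.split()[0]}"
    )
```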
docs/Supported_Models.md (2 changes: 1 addition & 1 deletion)

@@ -2,7 +2,7 @@

Llumnix serves as the request scheduling layer on top of the LLM backend engines. Therefore, all the models supported by the backend LLM engine should ideally be supported by Llumnix. We are also conducting full compatibility tests on different models.

-Currently, Llumnix is developed on top of the vLLM (version 0.4.2), making its supported models identical to those of vLLM. Up to now, our primary testing of Llumnix has been conducted on Qwen and Llama models, including:
+Currently, Llumnix is developed on top of the vLLM (version 0.6.3.post1), making its supported models identical to those of vLLM. Up to now, our primary testing of Llumnix has been conducted on Qwen and Llama models, including:

- Llama
- Llama2
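An illustrative aside, not part of this change: since model support is inherited from the pinned vLLM, a runtime check of the installed vLLM version can catch mismatches early. `importlib.metadata` is standard library; the expected version string is the one introduced by this PR.

```python
# Sketch only: warn when the installed vLLM differs from the version
# Llumnix is built against (0.6.3.post1, per this PR).
from importlib.metadata import PackageNotFoundError, version

EXPECTED_VLLM = "0.6.3.post1"
try:
    installed = version("vllm")
except PackageNotFoundError:
    installed = None

if installed != EXPECTED_VLLM:
    print(f"Warning: Llumnix targets vLLM {EXPECTED_VLLM}, found {installed}")
```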
setup.py (3 changes: 1 addition & 2 deletions)

@@ -34,7 +34,7 @@ def readme():
setup(
    name='llumnix',
    version='0.0.2',
-    python_requires='>=3.8.1, <3.11',
+    python_requires='>=3.9.0, <3.11',
    description='Efficient and easy multi-instance LLM serving',
    long_description=readme(),
    long_description_content_type="text/markdown",
@@ -49,7 +49,6 @@ def readme():
    platforms=["all"],
    classifiers=[
        'Programming Language :: Python',
-        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        "License :: OSI Approved :: Apache Software License",
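One more illustrative sketch, assuming the `packaging` library (a common setuptools dependency, not referenced by this PR): how pip-style tools evaluate the new `python_requires` specifier, showing that 3.8.x interpreters are now rejected while 3.9~3.10 still pass.

```python
# Demonstration only: evaluate the new specifier the way pip would.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">=3.9.0, <3.11")
for candidate in ("3.8.1", "3.9.0", "3.10.12", "3.11.0"):
    print(candidate, Version(candidate) in spec)
# 3.8.1 False · 3.9.0 True · 3.10.12 True · 3.11.0 False
```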