Actions: AlibabaPAI/llumnix

offline_inference

484 workflow runs
[BugFix] Fix request leaking bug after upgrading vllm to v0.6.3.post1
offline_inference #434: Pull request #95 synchronize by s5u13b
February 8, 2025 03:16 · 56m 32s · fix-migration

[WIP-BladeLLM] Support migration feature for BladeLLM
offline_inference #433: Pull request #90 synchronize by KuilongCui
February 7, 2025 12:10 · 1h 12m 40s · migartion_blade

[BugFix][Refactor] Fix bugs and refine codes for large scale simulator test
offline_inference #432: Pull request #93 synchronize by s5u13b
February 7, 2025 11:38 · 38m 58s · request_timestamps

[WIP-BladeLLM] Support migration feature for BladeLLM
offline_inference #431: Pull request #90 synchronize by KuilongCui
February 7, 2025 11:35 · 9m 50s · migartion_blade

[WIP-BladeLLM] Support migration feature for BladeLLM
offline_inference #430: Pull request #90 synchronize by KuilongCui
February 7, 2025 11:34 · 1m 20s · migartion_blade

[BugFix] Fix request leaking bug after upgrading vllm to v0.6.3.post1
offline_inference #429: Pull request #95 synchronize by s5u13b
February 7, 2025 09:41 · 6m 30s · fix-migration

[BugFix] Fix request leaking bug after upgrading vllm to v0.6.3.post1
offline_inference #428: Pull request #95 synchronize by s5u13b
February 7, 2025 08:09 · 12m 22s · fix-migration

[BugFix] Fix request leaking bug after upgrading vllm to v0.6.3.post1
offline_inference #427: Pull request #95 synchronize by s5u13b
February 7, 2025 07:01 · 42m 57s · fix-migration

[BugFix] Fix request leaking bug after upgrading vllm to v0.6.3.post1
offline_inference #426: Pull request #95 synchronize by s5u13b
February 7, 2025 06:54 · 6m 54s · fix-migration

[BugFix] Fix request leaking bug after upgrading vllm to v0.6.3.post1
offline_inference #425: Pull request #95 synchronize by s5u13b
February 7, 2025 06:51 · 2m 49s · fix-migration

[Core] Increase the instance type when scaling up llumlet (#87)
offline_inference #424: Commit bbfb6dd pushed by KuilongCui
February 7, 2025 02:01 · 6m 34s · main

[Core] Increase the instance type when scaling up llumlet
offline_inference #423: Pull request #87 synchronize by KuilongCui
January 23, 2025 10:00 · 39m 43s · engine_type

[Core] Increase the instance type when scaling up llumlet
offline_inference #422: Pull request #87 synchronize by KuilongCui
January 23, 2025 09:07 · 29m 53s · engine_type

[Core] Upgrade vllm to v0.6.3.post1 (#69)
offline_inference #421: Commit 2a4822a pushed by KuilongCui
January 23, 2025 08:47 · 1h 44m 32s · main

[BugFix][Refactor] Fix bugs and refine codes for large scale simulator test
offline_inference #420: Pull request #93 synchronize by s5u13b
January 23, 2025 07:49 · 54m 51s · request_timestamps

[Core] Upgrade vllm to v0.6.3.post1
offline_inference #419: Pull request #69 synchronize by ZeldaHuang
January 23, 2025 07:46 · 6m 50s · vllm_upgrade

[Core] Upgrade vllm to v0.6.3.post1
offline_inference #418: Pull request #69 synchronize by ZeldaHuang
January 23, 2025 06:35 · 58m 50s · vllm_upgrade

[BugFix][Refactor] Fix bugs and refine codes for large scale simulator test
offline_inference #417: Pull request #93 synchronize by s5u13b
January 22, 2025 07:21 · 24m 7s · request_timestamps

[Core] Increase the instance type when scaling up llumlet
offline_inference #416: Pull request #87 synchronize by KuilongCui
January 21, 2025 12:24 · 27m 58s · engine_type

[Core] Increase the instance type when scaling up llumlet
offline_inference #415: Pull request #87 synchronize by KuilongCui
January 21, 2025 11:10 · 31m 23s · engine_type

[Core] Upgrade vllm to v0.6.3.post1
offline_inference #414: Pull request #69 synchronize by ZeldaHuang
January 21, 2025 10:48 · 6m 29s · vllm_upgrade

[BugFix][Refactor] Fix bugs and refine codes for large scale simulator test
offline_inference #413: Pull request #93 synchronize by s5u13b
January 21, 2025 09:28 · 1m 54s · request_timestamps

[Core] Increase the instance type when scaling up llumlet
offline_inference #412: Pull request #87 synchronize by KuilongCui
January 21, 2025 08:56 · 1m 52s · engine_type

[Core] Increase the instance type when scaling up llumlet
offline_inference #411: Pull request #87 synchronize by KuilongCui
January 21, 2025 08:08 · 3m 5s · engine_type

[Core] Increase the instance type when scaling up llumlet
offline_inference #410: Pull request #87 synchronize by KuilongCui
January 21, 2025 07:51 · 3m 2s · engine_type