Actions: cooleel/vllm

Workflow: ruff

11 workflow runs

ruff #11: [Core] Support Lark grammars for XGrammar (#10870)
    Commit 8b59631 pushed by cooleel to main
    December 6, 2024 16:13 (26s)

ruff #10: [Bugfix] Fix using -O[0,3] with LLM entrypoint (#10677)
    Commit 9a99273 pushed by cooleel to main
    November 26, 2024 19:05 (25s)

(untitled run) November 17, 2024 20:36 (21s)

ruff #8: [V1] Support per-request seed (#9945)
    Commit 1f1b6d6 pushed by cooleel to main
    November 3, 2024 17:27 (27s)

ruff #7: [bugfix] fix tsts (#9959)
    Commit 3bb4bef pushed by cooleel to main
    November 3, 2024 15:22 (29s)

(untitled run) October 31, 2024 17:59 (27s)
(untitled run) October 30, 2024 13:21 (29s)

ruff #4: [Bugfix] Fix prefix strings for quantized VLMs (#9772)
    Commit bc73e98 pushed by cooleel to main
    October 29, 2024 23:09 (29s)

ruff #3: [Bugfix] Fix ray instance detect issue (#9439)
    Commit 2adb440 pushed by cooleel to main
    October 28, 2024 14:02 (32s)

ruff #2: [Hardware][ROCM] using current_platform.is_rocm (#9642)
    Commit 4e2d95e pushed by cooleel to main
    October 28, 2024 04:44 (26s)

(untitled run) October 27, 2024 01:23 (28s)