Actions: NVIDIA/TensorRT-LLM

Blossom-CI

483 workflow runs

openai_server error
Blossom-CI #308: Issue comment #2357 (comment) created by Justin-12138
January 6, 2025 02:40 5s
qserve is slower than awq int4 for llama2-7b on H100
Blossom-CI #304: Issue comment #2509 (comment) created by KKwanhee
January 4, 2025 13:23 4s
[Feature Request] Better support for w4a8 quantization
Blossom-CI #303: Issue comment #2605 (comment) created by ShuaiShao93
January 4, 2025 00:13 5s
Update linux.md
Blossom-CI #302: Issue comment #2653 (comment) created by nv-guomingz
January 3, 2025 08:01 3s
Adding custom sampling config
Blossom-CI #301: Issue comment #2609 (comment) created by buddhapuneeth
January 2, 2025 19:04 5s
How to suppress the WARNING logging?
Blossom-CI #300: Issue comment #2610 (comment) created by yuekaizhang
January 2, 2025 03:21 4s
support qwen2.5 models
Blossom-CI #299: Issue comment #2336 (comment) created by yusufcakmakk
January 1, 2025 19:55 4s
Support for DeepseekV2ForCausalLM
Blossom-CI #298: Issue comment #2340 (comment) created by dominicshanshan
December 31, 2024 05:28 4s
Docker image
Blossom-CI #297: Issue comment #52 (comment) created by Fatemehkiasaveh
December 31, 2024 04:46 5s
How to make it not display info information? executorExampleBasic.cpp
Blossom-CI #296: Issue comment #2637 (comment) created by chuangz0
December 31, 2024 04:46 4s
greedy search results vary with each inference
Blossom-CI #295: Issue comment #2640 (comment) created by zhangts20
December 31, 2024 04:38 4s
greedy search results vary with each inference
Blossom-CI #294: Issue comment #2640 (comment) created by fclearner
December 31, 2024 02:19 5s
Feature Request: Add Min-P sampling layer
Blossom-CI #293: Issue comment #1154 (comment) created by aikitoria
December 31, 2024 00:09 4s
Support for DeepseekV2ForCausalLM
Blossom-CI #292: Issue comment #2340 (comment) created by Pernekhan
December 30, 2024 19:46 5s
Feature Request: Add Min-P sampling layer
Blossom-CI #291: Issue comment #1154 (comment) created by aikitoria
December 30, 2024 14:07 5s
support for T4
Blossom-CI #290: Issue comment #2620 (comment) created by krishnanpooja
December 30, 2024 10:00 5s
Unable to install TensorRT-LLM
Blossom-CI #289: Issue comment #2597 (comment) created by zhangts20
December 28, 2024 02:03 5s
Troubleshoot mistral model
Blossom-CI #288: Issue comment #2632 (comment) created by zhangts20
December 28, 2024 01:54 5s
Wrong result when using lora on multi gpus
Blossom-CI #287: Issue comment #2589 (comment) created by ShuaiShao93
December 28, 2024 00:14 3s
Wrong result when using lora on multi gpus
Blossom-CI #286: Issue comment #2589 (comment) created by ShuaiShao93
December 27, 2024 23:11 4s
Wrong result when using lora on multi gpus
Blossom-CI #285: Issue comment #2589 (comment) created by ShuaiShao93
December 27, 2024 22:22 4s
Wrong result when using lora on multi gpus
Blossom-CI #284: Issue comment #2589 (comment) created by ShuaiShao93
December 27, 2024 20:00 4s