From 1a401252b5c1cb6cab348531281c5bd340257733 Mon Sep 17 00:00:00 2001
From: Woosuk Kwon
Date: Mon, 13 Jan 2025 17:24:36 -0800
Subject: [PATCH] [Docs] Add Sky Computing Lab to project intro (#12019)

Signed-off-by: Woosuk Kwon
---
 README.md            | 2 ++
 docs/source/index.md | 2 ++
 2 files changed, 4 insertions(+)

diff --git a/README.md b/README.md
index e24e1a227cf40..658b9fb6edd8c 100644
--- a/README.md
+++ b/README.md
@@ -38,6 +38,8 @@ The first vLLM meetup in 2025 is happening on January 22nd, Wednesday, with Goog
 ## About
 
 vLLM is a fast and easy-to-use library for LLM inference and serving.
+Originally developed in the [Sky Computing Lab](https://sky.cs.berkeley.edu) at UC Berkeley, vLLM has evolved into a community-driven project with contributions from both academia and industry.
+
 vLLM is fast with:
 
 - State-of-the-art serving throughput

diff --git a/docs/source/index.md b/docs/source/index.md
index 8f9493d77186e..d7a1117df9c27 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -23,6 +23,8 @@
 vLLM is a fast and easy-to-use library for LLM inference and serving.
 
+Originally developed in the [Sky Computing Lab](https://sky.cs.berkeley.edu) at UC Berkeley, vLLM has evolved into a community-driven project with contributions from both academia and industry.
+
 vLLM is fast with:
 
 - State-of-the-art serving throughput