[Doc] Update FAQ links in spec_decode.rst (vllm-project#9662)
Signed-off-by: whyiug <[email protected]>
Signed-off-by: Isotr0py <[email protected]>
whyiug authored and Isotr0py committed Nov 8, 2024
1 parent c783237 commit 4775ad0
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions docs/source/models/spec_decode.rst
@@ -182,7 +182,7 @@ speculative decoding, breaking down the guarantees into three key areas:
3. **vLLM Logprob Stability**
- vLLM does not currently guarantee stable token log probabilities (logprobs). This can result in different outputs for the
same request across runs. For more details, see the FAQ section
-titled *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq.rst>`_.
+titled *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq>`_.


**Conclusion**
@@ -197,7 +197,7 @@ can occur due to following factors:

**Mitigation Strategies**

-For mitigation strategies, please refer to the FAQ entry *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq.rst>`_.
+For mitigation strategies, please refer to the FAQ entry *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq>`_.
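
Both hunks make the same fix: the relative link target drops the ``.rst`` extension, since the source filename no longer exists once Sphinx builds the docs to HTML. As an aside (not part of this commit), the usual way to cross-reference another document in Sphinx reStructuredText is the ``:doc:`` role, which is resolved and validated at build time; a sketch using the same target:

```rst
For more details, see the FAQ entry in :doc:`../serving/faq`.
```

Unlike a plain hyperlink, a ``:doc:`` reference to a missing document produces a Sphinx build warning, so broken links of this kind are caught earlier.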

Resources for vLLM contributors
-------------------------------
