llama-cpp-python: Support for logprobs
#1039
Merged
This PR contains:
What is the current behavior? (You can also link to an open issue here)
The documentation currently states that llama-cpp-python does not support logprobs. We documented it that way because, at the time, the library returned logprobs in the wrong format, which broke downstream consumers.
See previous discussion with jjallaire in #666 (review).
What is the new behavior?
Update the documentation to state that llama-cpp-python supports logprobs.
This is correct, now that abetlen/llama-cpp-python#1788 has been merged and released in llama-cpp-python 0.3.5.
NB: No 'real' code changes are needed, as I already wired up logprobs support previously. Now that llama-cpp-python returns them correctly when requested, effectively only a documentation change is required.
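For context, a minimal sketch of consuming the logprobs field is below. The response fragment is hypothetical, assuming the OpenAI-style chat logprobs shape (a `content` list of per-token entries with `token`, `logprob`, and `top_logprobs`) that llama-cpp-python now mirrors; the exact values are made up for illustration.

```python
import math

# Hypothetical choice from a completion response, in the OpenAI-style
# logprobs shape (assumption for illustration; values are invented).
choice = {
    "logprobs": {
        "content": [
            {
                "token": "Hello",
                "logprob": -0.10,
                "top_logprobs": [
                    {"token": "Hello", "logprob": -0.10},
                    {"token": "Hi", "logprob": -2.50},
                ],
            },
        ]
    }
}

# Convert each sampled token's log-probability back to a probability.
probs = [math.exp(t["logprob"]) for t in choice["logprobs"]["content"]]
print(probs)
```

A consumer that previously failed on a malformed logprobs payload should now be able to parse this structure directly.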
Does this PR introduce a breaking change? (What changes might users need to make in their application due to this PR?)
No