Replies: 2 comments
-
@dosu-bot
How do I get logprobs from a query index?
-
Hey @Munken! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this together! To get log probabilities (logprobs) from a query index using the LlamaIndex framework, you can work at the LLM level and use an `extract_logprobs` helper to pull them out of the raw completion response.
Here's a code snippet illustrating this:

```python
from typing import List, Optional

from llama_index.core.base.llms.types import LogProb  # import path may vary by version

def extract_logprobs(response) -> Optional[List[List[LogProb]]]:
    # Only build the nested list if the provider returned logprobs at all.
    if response.choices[0].logprobs is not None:
        choice_logprobs = []
        for choice in response.choices:
            choice_logprobs.append(extract_logprobs_choice(choice))
    else:
        choice_logprobs = None
    return choice_logprobs
```
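The `extract_logprobs_choice` helper referenced above isn't included in the snippet. As a rough sketch only (not the library's actual implementation), assuming an OpenAI-style chat completion choice where each logprob entry carries `token`, `logprob`, and `bytes` fields, it could look like this:

```python
from typing import List, Optional

from llama_index.core.base.llms.types import LogProb

def extract_logprobs_choice(choice) -> Optional[List[LogProb]]:
    # Assumes choice.logprobs.content is a list of per-token entries, as
    # returned by OpenAI-style chat completions when logprobs are enabled.
    if choice.logprobs is None or choice.logprobs.content is None:
        return None
    return [
        LogProb(token=entry.token, logprob=entry.logprob, bytes=entry.bytes or [])
        for entry in choice.logprobs.content
    ]
```

Each choice then yields a `List[LogProb]`, and `extract_logprobs` stacks them into the `List[List[LogProb]]` that ends up on the response.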
The `extract_logprobs` function is typically used in conjunction with a completion request, as shown in this `complete` method:

```python
@llm_completion_callback()
def complete(
    self, prompt: str, formatted: bool = False, **kwargs: Any
) -> CompletionResponse:
    # ... (setup code)
    completion_response = self._runner.send_chat_completion_request(request)
    return CompletionResponse(
        text=completion_response.choices[0].message.content,
        logprobs=extract_logprobs(completion_response),
    )
```

This setup sends a completion request and extracts the log probabilities from the response using `extract_logprobs`, so they are available on the returned `CompletionResponse`.
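To connect this back to the original question: the extraction happens at the LLM level, and the logprobs surface on the `CompletionResponse` the LLM returns. Here's a minimal sketch of reading them from a direct LLM call, assuming an OpenAI-backed LLM and that the logprob options are forwarded through `additional_kwargs` (exact support and parameter names depend on your LlamaIndex version and integration):

```python
from llama_index.llms.openai import OpenAI

# Assumption: these kwargs are passed straight through to the chat completions
# API so the provider returns per-token logprobs alongside the text.
llm = OpenAI(
    model="gpt-4o-mini",
    additional_kwargs={"logprobs": True, "top_logprobs": 3},
)

response = llm.complete("The capital of France is")
print(response.text)

# CompletionResponse.logprobs is a List[List[LogProb]] (one inner list per
# choice) when the integration extracts them, and None otherwise.
if response.logprobs is not None:
    for lp in response.logprobs[0]:
        print(f"{lp.token!r}: {lp.logprob:.4f}")
```

The same LLM instance can then be plugged into your index or query engine; the logprob extraction itself still happens on the LLM's responses, as in the `complete` method above.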