@FelipeAdachi My understanding is that hallucination detection works by prompting the LLM with a default prompt built into the langkit library.
My question is: should we use our own custom prompt for this, rather than go with the default one?
The hallucination detection will call the LLM in two distinct phases:
1. To generate additional samples, based on the prompt that was passed
2. To perform a consistency check between the answer and the generated samples
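The two phases above can be sketched roughly as follows. This is a hypothetical illustration of the general approach (in the spirit of SelfCheckGPT-style consistency checking), not langkit's actual internals; `call_llm` and the 0 / 0.5 / 1 score values are assumptions for the sake of the example.

```python
def call_llm(prompt: str) -> str:
    # Placeholder for a real LLM client call; always answers "Accurate" here.
    return "Accurate"


def detect_hallucination(prompt: str, response: str, num_samples: int = 3) -> float:
    """Return a hallucination score in [0, 1]; higher means less consistent."""
    # Phase 1: generate additional samples from the same original prompt.
    samples = [call_llm(prompt) for _ in range(num_samples)]

    # Phase 2: ask the LLM whether the response is consistent with each sample.
    label_scores = {"Accurate": 0.0, "Minor Inaccurate": 0.5, "Major Inaccurate": 1.0}
    scores = []
    for sample in samples:
        verdict = call_llm(
            f"Context: {sample}\nSentence: {response}\n"
            "Is the sentence supported by the context? "
            "Answer Accurate, Minor Inaccurate, or Major Inaccurate."
        )
        # Unrecognized verdicts are treated as worst case.
        scores.append(label_scores.get(verdict.strip(), 1.0))
    return sum(scores) / len(scores)
```

With the stubbed `call_llm` above, every consistency check returns "Accurate", so the score is 0.0.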
I'm assuming you want to pass your own custom prompt for the second item, correct? If that's the case, no, we currently don't support it.
Right now, the code expects an output that is one of [Accurate, Minor Inaccurate, Major Inaccurate] and assigns a score to each of these three values. To support a custom prompt, we'd either need to require the same output format, or add some sort of mapping from categories to values.
Can you share more details on why you need a custom prompt for your use case?
How can I use my own domain-specific prompt for the response hallucination detection function call in langkit?