
LLM output error #259

Open
ha3ketr0x opened this issue Sep 16, 2024 · 2 comments

Comments

@ha3ketr0x

Hi,
Sometimes Transcribe throws a "no LLM output" or no-result error.

In addition, the output sometimes gets stuck and stops changing. For example, if someone asks "what is a PC" and then follows up with "what are Kubernetes keys", it never moves on to answer the second question. I have checked multiple times, and I am not clicking on the question myself.
I am on the latest version of Transcribe.

@mang0sw33t
Collaborator

Please share a screenshot of the application along with a description of the expected and observed behavior with respect to that screenshot.
A screenshot of the command line used to start Transcribe might be useful as well.

@ha3ketr0x
Author

Hi,

I am not facing this right now, but it still happens occasionally, even with API credit available.
It sometimes shows "no LLM found". In addition, sometimes after it listens to a question the answer just stops and never moves forward, even though I am not clicking any question to stop it. The questions are visible, but no answers appear, and I am not clicking "do not suggest" at that time either.

Some of the "No LLM found" errors might simply be due to an API outage.

Separately, is there a way to stop the answer after every question, so that we have to click before it starts listening again?

Also, I tried copying the content below into override.yaml, but it throws an error in the command prompt. Could you also guide me on how to use the Perplexity API with Transcribe? Sorry for bombarding you with so many questions at once. This is what I pasted:

```yaml
default_prompt_preamble: "You are a casual pal, genuinely interested in the conversation at hand. A poor transcription of conversation is given below. "
default_prompt_epilogue: "Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

# The combination of system_prompt, initial_convo is used to create a multi turn prompt message for LLM.
# system_prompt_1, system_prompt_2 are here as samples of other possible prompts.
# Only the content of the system_prompt parameter will be used.

system_prompt: "You are a casual pal, genuinely interested in the conversation at hand. Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

system_prompt: "You are an expert at Basketball and helping others learn about basketball. Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

system_prompt: "You are an expert at Fantasy Football and helping others learn about Fantasy football. Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

system_prompt: "You are an expert Agile Coach and are interviewing for a position. Respond in detail to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

summary_prompt: 'Create a summary of the following text'
```
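For reference, below is the minimal version I am planning to retry in override.yaml. This is only a sketch based on the keys shown above: it keeps a single active system_prompt entry and straight quotes, since I am guessing (not certain) that the repeated system_prompt keys and/or the smart quote in the Agile Coach line are what the YAML parser is rejecting.

```yaml
# override.yaml - minimal sketch, not confirmed against the app's parser.
# Only one system_prompt key is kept active, in case duplicate keys or
# smart quotes were the cause of the command-prompt error.
default_prompt_preamble: "You are a casual pal, genuinely interested in the conversation at hand. A poor transcription of conversation is given below. "
default_prompt_epilogue: "Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

# Alternate prompts kept as comments; uncomment exactly one to use it.
# system_prompt: "You are an expert at Basketball and helping others learn about basketball. ..."
# system_prompt: "You are an expert at Fantasy Football and helping others learn about Fantasy football. ..."
# system_prompt: "You are an expert Agile Coach and are interviewing for a position. ..."

system_prompt: "You are a casual pal, genuinely interested in the conversation at hand. Please respond, in detail, to the conversation. Confidently give a straightforward response to the speaker, even if you don't understand them. Give your response in square brackets. DO NOT ask to repeat, and DO NOT ask for clarification. Just answer the speaker directly."

summary_prompt: 'Create a summary of the following text'
```

On the Perplexity question: as far as I understand, Perplexity exposes an OpenAI-compatible chat completions API at https://api.perplexity.ai, so it may work if Transcribe allows overriding the API base URL and model name, but I have not confirmed whether the current override.yaml supports that.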
