
[voice] Add length limit to TTS handled by cache #3699

Merged (2 commits into openhab:main) on Jul 15, 2023

Conversation

@dalgwen (Contributor) commented on Jul 12, 2023

During the TTS cache development, there was some discussion about adding a per-request boolean to enable or disable caching. We didn't keep it because it made the API complicated.
But time has passed and new usages have arisen, so maybe it can now be discussed in a new form?

For example, there is now a ChatGPT binding, which makes it possible to have advanced chats/queries with an LLM. Most of the time, the chat responses are lengthy, so a single "discussion" with an LLM can clog the LRU cache and evict older entries worth keeping.

Proposal:
We can use the length of the TTS text as a simple way to automatically tell apart the sentences that should be cached.
The TTS sentences worth caching are those likely to be repeated, and they are nearly always short. Examples:
"Ok, I'll do that"
"Hello, what can I do for you?"
"You've got a message"
"Please close the garage door"

On the other hand, one can safely assume that long TTS outputs are generated for reports of some kind (weather, ChatGPT?) and are probably not meant to be repeated. They are also the entries that take the most space in the cache.

I set a default limit of 150 characters (configurable). Whether the whole idea is worth pursuing is open for discussion.

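The gist of the proposal can be sketched as a simple length gate in front of the cache. This is an illustrative sketch, not the actual openHAB implementation; the class and method names below are hypothetical:

```java
// Hypothetical sketch of a length-based TTS cache gate.
// Short utterances (likely to be repeated, e.g. "Ok, I'll do that") are
// cached; long ones (weather reports, LLM replies) bypass the cache.
public class TtsCacheGate {

    /** Default limit, matching the 150-character default proposed in this PR. */
    public static final int DEFAULT_CHARACTER_LIMIT = 150;

    private final int characterLimit;

    public TtsCacheGate(int characterLimit) {
        this.characterLimit = characterLimit;
    }

    /** Returns true when the utterance is short enough to be worth caching. */
    public boolean shouldCache(String text) {
        return text != null && text.length() <= characterLimit;
    }
}
```

With the default limit, a greeting like "Hello, what can I do for you?" would be cached, while a multi-sentence ChatGPT reply would skip the cache and could never evict the short, frequently repeated entries.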

Signed-off-by: Gwendal Roulleau <[email protected]>
@dalgwen dalgwen requested a review from a team as a code owner July 12, 2023 11:53
@lolodomo (Contributor) left a comment:

LGTM

@lolodomo (Contributor) commented:

Looks to me like a very good idea.

@J-N-K (Member) left a comment:


In general LGTM. Is the limit in characters? Maybe add that to the description.

@J-N-K J-N-K added the enhancement An enhancement or new feature of the Core label Jul 14, 2023
Apply code review

Signed-off-by: Gwendal Roulleau <[email protected]>
@dalgwen (Contributor, Author) commented on Jul 15, 2023

Thank you both for your reviews!

@J-N-K (Member) left a comment:


LGTM, thanks

@J-N-K J-N-K added this to the 4.0 milestone Jul 15, 2023
@J-N-K J-N-K merged commit 07e8823 into openhab:main Jul 15, 2023
2 checks passed
@dalgwen dalgwen deleted the tts_cache_add_length_limit branch July 15, 2023 14:03
Labels
enhancement An enhancement or new feature of the Core
3 participants