Way too much context #1179
Replies: 3 comments
-
Hi @jaluki96 I haven't documented this well yet, but there is an 'experimental' feature to summarize the context as you go along. By default, it will summarize every message incrementally as soon as you reach half of the model's context window. For gpt-4-1106, that can still be quite large (64K), so I will add a setting soon to adjust it, along with proper documentation. You can enable it in your .env file.
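For reference, a minimal sketch of what that would look like in the .env file — the variable name below is my assumption of how the summarization toggle is exposed, so check the .env.example shipped with your LibreChat version to confirm:

```
# Experimental: summarize conversation context incrementally for OpenAI endpoints
# (assumed variable name — verify against your version's .env.example)
OPENAI_SUMMARIZE=true
```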
-
@jaluki96 An alternative to what @danny-avila suggested (here I am 3 weeks later) could be using https://conturata.com/ai/chunker. I use it occasionally myself. Disclaimer: I am in no way affiliated with anyone involved with Conturata.
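For anyone who prefers not to rely on an external site, here is a rough sketch of the same idea in TypeScript: it splits a long text into pieces that each fit under a token budget. The function name and the 4-characters-per-token heuristic are my own assumptions for illustration, not how Conturata's tool actually works.

```ts
// Split a long document into chunks that each fit under a rough token budget.
// Assumption: ~4 characters per token on average (a crude heuristic, not a real tokenizer).
function chunkText(text: string, maxTokens = 2000, charsPerToken = 4): string[] {
  const maxChars = maxTokens * charsPerToken;
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += maxChars) {
    chunks.push(text.slice(i, i + maxChars));
  }
  return chunks;
}

// Example: paste each chunk into the chat one message at a time instead of the full text.
const longDocument = 'some very long text to send to the model... '.repeat(5000);
for (const [i, part] of chunkText(longDocument).entries()) {
  console.log(`--- chunk ${i + 1} (${part.length} chars) ---`);
}
```

Sending the chunks one message at a time keeps each request under the model's limit, though the model will still only "see" as much prior text as fits in its context window.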
-
Thanks for the suggestion! I will provide a configurable way to limit the max token context soon.
-
Hello, I'm an absolute beginner in programming but got LibreChat to work. My conversation context has grown so large that it now exceeds the context size of the newest GPT-4 version. Is there a way to reduce the context to something less than the whole conversation, or to "clean" it so I can keep using the conversation? Also, is there a way to set a maximum context? It got quite expensive in a short amount of time because of the context size.