Currently, the text chunk sizes and the maximum tokens per response are hardcoded, which means:

- requests aren't optimized to use each model's maximum allowed tokens
- input is capped at 120,000 characters

I can explore using tiktoken to count tokens before submitting requests, which would allow for more intelligent batching and recursive summarization.
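A minimal sketch of what token-aware chunking could look like (the function names, the `cl100k_base` encoding choice, and the chars/4 fallback are all my assumptions here, not existing project code):

```python
# Hypothetical sketch: count tokens with tiktoken (if available) and split
# text into chunks that each fit within a per-request token budget.
try:
    import tiktoken
    _enc = tiktoken.get_encoding("cl100k_base")  # assumed encoding choice

    def count_tokens(text: str) -> int:
        return len(_enc.encode(text))
except ImportError:
    def count_tokens(text: str) -> int:
        # Rough fallback heuristic: ~4 characters per token for English text.
        return max(1, len(text) // 4)

def chunk_by_tokens(text: str, max_tokens: int) -> list[str]:
    """Greedily group paragraphs into chunks of at most max_tokens.

    Note: a single paragraph larger than max_tokens still becomes its own
    chunk; a real implementation would need to split within paragraphs too.
    """
    chunks: list[str] = []
    current: list[str] = []
    current_tokens = 0
    for para in text.split("\n\n"):
        t = count_tokens(para)
        if current and current_tokens + t > max_tokens:
            chunks.append("\n\n".join(current))
            current, current_tokens = [], 0
        current.append(para)
        current_tokens += t
    if current:
        chunks.append("\n\n".join(current))
    return chunks
```

The same `count_tokens` helper could then drive recursive summarization: summarize each chunk, concatenate the summaries, and repeat while the combined summary still exceeds the model's budget.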