[Bug]: TokenCountingHandler remains 0 #12214
Comments
Based on the information you've provided and the context from the LlamaIndex repository, it seems like there might be an issue with how the TokenCountingHandler is being wired up through the global Settings … To troubleshoot this issue, consider the following steps: …

Unfortunately, I wasn't able to find specific changes made to the TokenCountingHandler that would explain this. If you could provide more details about the behavior you're experiencing or any additional context, it would be helpful in further diagnosing the issue. For example, are there any specific conditions or steps that lead to the counts remaining 0?
It's kind of jank --- try attaching it directly to the LLM too
Hi @logan-markewich, thanks for your reply! I changed it to the following:

```python
import tiktoken
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager, TokenCountingHandler
from llama_index.llms.openai import OpenAI

token_counter = TokenCountingHandler(
    tokenizer=tiktoken.encoding_for_model("gpt-3.5-turbo").encode,
    verbose=True,  # set to True to see usage printed to the console
)
callback_manager = CallbackManager([token_counter])
Settings.callback_manager = callback_manager
Settings.llm = OpenAI(model="gpt-3.5-turbo", temperature=0.7, callback_manager=callback_manager)
```

Unfortunately, no luck.
So weird. OK one more thing
Thanks for the help. I tried it, and unfortunately it's still all zeroes! Very weird behaviour. I'm now using the legacy ServiceContext object to make it work.
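For context, the legacy workaround mentioned here looks roughly like this (a sketch; ServiceContext is deprecated in 0.10.x, and the data path and query string are placeholders):

```python
import tiktoken
from llama_index.core import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.callbacks import CallbackManager, TokenCountingHandler
from llama_index.llms.openai import OpenAI

token_counter = TokenCountingHandler(
    tokenizer=tiktoken.encoding_for_model("gpt-3.5-turbo").encode,
    verbose=True,
)

# Deprecated in 0.10.x, but it still accepts a callback_manager and passes it
# down to the components it constructs.
service_context = ServiceContext.from_defaults(
    llm=OpenAI(model="gpt-3.5-turbo", temperature=0.7),
    callback_manager=CallbackManager([token_counter]),
)

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
index.as_query_engine().query("placeholder question")

print("Total LLM tokens:", token_counter.total_llm_token_count)
```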
Any progress on it? I am having the same problem.
Same issue here.
Same issue: my first invocation shows a zero count, but subsequent calls update the token count correctly.
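One way to observe that pattern is to reset and print the counters around each call; a small sketch, assuming the token_counter and index setup from the snippets above (the questions are placeholders):

```python
# Reset before each call so every query's count starts from zero; if the
# handler is wired up correctly, each print should show a non-zero total.
query_engine = index.as_query_engine()
for question in ["placeholder question one", "placeholder question two"]:
    token_counter.reset_counts()
    query_engine.query(question)
    print(question, "->", token_counter.total_llm_token_count, "LLM tokens")
```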
Bug Description
Hi! Love this project, and it's a blessing to work with.
I ran into a small problem. When implementing the new TokenCountingHandler through the new global Settings, the token counts remain 0. It does not return any warning or error. What am I missing? Any help is much appreciated 🙏
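For context, the Settings-only setup being described is roughly the following; this is a sketch of my reading of the report, not the reporter's exact code, and the data path and query string are placeholders:

```python
import tiktoken
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.callbacks import CallbackManager, TokenCountingHandler
from llama_index.llms.openai import OpenAI

# Register the handler only through the global Settings object.
token_counter = TokenCountingHandler(
    tokenizer=tiktoken.encoding_for_model("gpt-3.5-turbo").encode
)
Settings.callback_manager = CallbackManager([token_counter])
Settings.llm = OpenAI(model="gpt-3.5-turbo")

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)
index.as_query_engine().query("placeholder question")

# Reported behaviour: this stays at 0, with no warning or error.
print(token_counter.total_llm_token_count)
```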
Version
0.10.23
Steps to Reproduce
Relevant Logs/Tracebacks
No response