How can I obtain a model's token consumption through the api/search API, for example when using an OpenAI model? Specifically, how do I get the tokens consumed by a single request, such as:
"usage": {
"prompt_tokens": 13,
"completion_tokens": 7,
"total_tokens": 20,
"completion_tokens_details": {
"reasoning_tokens": 0,
"accepted_prediction_tokens": 0,
"rejected_prediction_tokens": 0
}
}
I need to obtain `prompt_tokens` and `completion_tokens`. How can I modify the source code to obtain this information?
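I'm not familiar with the project's internals, but here is a minimal sketch of what I'm after, assuming the search handler calls the official `openai` Python SDK (v1.x). The `run_search` function name and the response payload shape are hypothetical; the point is that the SDK already returns a `usage` object that could be passed through in the api/search response:

```python
# Minimal sketch: read token usage from an OpenAI chat completion and
# return it alongside the search answer. Function name and payload
# shape are hypothetical; adapt to the actual api/search handler.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def run_search(query: str) -> dict:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": query}],
    )
    # response.usage holds prompt_tokens, completion_tokens, total_tokens
    usage = response.usage
    return {
        "answer": response.choices[0].message.content,
        "usage": {
            "prompt_tokens": usage.prompt_tokens,
            "completion_tokens": usage.completion_tokens,
            "total_tokens": usage.total_tokens,
        },
    }
```

Is there an existing place in the codebase where the model response is assembled so that this `usage` data could be surfaced in the api/search result?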