It would be very helpful if text responses could be streamed rather than awaited in full. This feature would be especially beneficial for chat applications. For instance (see the provider-level sketch below):

- In the case of OpenAI, passing `stream=True` with the request is sufficient.
- For AWS, the `converse_stream()` function needs to be called instead of `converse()`.
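
To make the request concrete, here is a minimal sketch of how streaming already works against each provider's own SDK. It assumes the official `openai` and `boto3` packages; the model IDs are just placeholders.

```python
# Provider-native streaming, for reference (not aisuite code).
from openai import OpenAI
import boto3

# OpenAI: stream=True turns the response into an iterator of chunks.
openai_client = OpenAI()
stream = openai_client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)

# AWS Bedrock: converse_stream() returns an event stream instead of a
# single response; text arrives in contentBlockDelta events.
bedrock = boto3.client("bedrock-runtime")
response = bedrock.converse_stream(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    messages=[{"role": "user", "content": [{"text": "Hello!"}]}],
)
for event in response["stream"]:
    if "contentBlockDelta" in event:
        print(event["contentBlockDelta"]["delta"]["text"], end="", flush=True)
```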
While I'm not a professional developer, I'd like to propose a potential implementation, which builds on the previous issue I raised (#96):
See: https://github.com/viictorjimenezzz/aisuite
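
For illustration only, here is roughly how a unified streaming call could look from the caller's side. The `stream=True` parameter and the chunk shape are hypothetical and not part of the current aisuite API; they simply mirror the OpenAI SDK's conventions.

```python
import aisuite as ai

client = ai.Client()

# Hypothetical: if create() accepted stream=True, it could yield
# incremental chunks instead of returning one completed response.
stream = client.chat.completions.create(
    model="openai:gpt-4o",  # or an "aws:..." model, resolved per provider
    messages=[{"role": "user", "content": "Tell me a short story."}],
    stream=True,  # hypothetical parameter
)
for chunk in stream:
    # Hypothetical chunk shape, mirroring OpenAI's delta format.
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```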
Thank you for considering this request!