Stream options are not available for Azure OpenAI? #1469
Comments
@SAIVENKATARAJU Azure OpenAI doesn't support `stream_options`.
@kristapratico So is there no other option to get usage when streaming is enabled?
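For context, with the (non-Azure) OpenAI API, setting `stream=True` together with `stream_options={"include_usage": True}` makes the final streamed chunk carry a `usage` object. A minimal sketch of consuming such a stream — the chunks here are faked dicts modeled on the streaming response shape, not real SDK objects:

```python
def usage_from_stream(chunks):
    """Return the usage dict from the final chunk of a chat-completions stream, if present."""
    usage = None
    for chunk in chunks:
        # With stream_options={"include_usage": True}, only the last chunk
        # carries a non-empty "usage" field; earlier chunks have usage=None.
        if chunk.get("usage"):
            usage = chunk["usage"]
    return usage

# Faked chunks modeled on the OpenAI streaming response shape:
fake_stream = [
    {"choices": [{"delta": {"content": "Hi"}}], "usage": None},
    {"choices": [{"delta": {"content": "!"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}},
]
print(usage_from_stream(fake_stream))
# {'prompt_tokens': 5, 'completion_tokens': 2, 'total_tokens': 7}
```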
Hi @kristapratico, can you share any estimate of when the Azure OpenAI REST API will start supporting `stream_options`? The workaround for us was to downgrade the `langchain-openai` package.
@apmavrin I do not have an ETA, but I am sharing your feedback with the team.
Can you share an issue or a repro for this? I wasn't able to reproduce this with version 0.1.10 + AzureChatOpenAI.
With langchain-openai 0.1.10, `chat_llm` is of type ChatOpenAI, and we instantiate it without supplying the `stream_options` parameter; we then invoke the stream function. When you look into this file from langchain-openai:

So the request to AzureOpenAI now looks like this:

and AzureOpenAI returns a 400 Bad Request error due to the presence of the unknown parameter `stream_options`. If I comment out this opinionated line, the error goes away.
```python
chat_llm = self.model_provider.get_chat_open_ai(
    tracing_context=tracing_context,
    model=model,
    streaming=True,
)
```

@apmavrin thanks. Is this custom code, or does langchain provide this somewhere? I'm having trouble understanding how we end up with `stream_options` in the request. I think it's fair to only add `stream_options` when the user explicitly requests it.
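The fix being discussed amounts to: only attach `stream_options` to the outgoing payload when the caller explicitly opts in. A minimal sketch of that idea (the `build_payload` helper is hypothetical, not langchain's actual code):

```python
def build_payload(model, messages, stream=False, stream_usage=None):
    """Build a chat-completions payload; add stream_options only on explicit opt-in."""
    payload = {"model": model, "messages": messages, "stream": stream}
    # Azure OpenAI (at the time of this thread) returns 400 on stream_options,
    # so the parameter is omitted unless the user explicitly asks for usage.
    if stream and stream_usage:
        payload["stream_options"] = {"include_usage": True}
    return payload

msgs = [{"role": "user", "content": "Hello"}]
assert "stream_options" not in build_payload("gpt-35-turbo", msgs, stream=True)
assert build_payload("gpt-4o", msgs, stream=True, stream_usage=True)["stream_options"] == {"include_usage": True}
```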
According to the documentation of the Azure OpenAI REST API, there simply is no such option. However, the documentation of the OpenAI REST API does list it. We would also appreciate it if Azure provided the `stream_options` functionality.
Hi @kristapratico, this is our code. We use the proxy to control which provider the request is forwarded to. As of today, it's the Azure OpenAI service, so all requests land on MS territory. The Azure OpenAI REST API lacks support for the `stream_options` parameter.
Yes, exactly. This was also our assumption: unless it's specified, do not modify the request, Langchain ;)
@apmavrin thanks for confirming the use case. I'm sharing the customer signal we're getting in this thread and working on trying to get an ETA I can share for when `stream_options` will be supported.
- **Description:** PR #22854 added the ability to pass `stream_options` through to the openai service to get token usage information in the response. Currently OpenAI supports this parameter, but Azure OpenAI does not yet. For users who proxy their calls to both services through ChatOpenAI, this breaks when targeting Azure OpenAI (see related discussion opened in openai-python: openai/openai-python#1469 (comment)).

  > Error code: 400 - {'error': {'code': None, 'message': 'Unrecognized request argument supplied: stream_options', 'param': None, 'type': 'invalid_request_error'}}

  This PR fixes the issue by only adding `stream_options` to the request if it's actually requested by the user (i.e. set to True). If I'm not mistaken, we have a test case that already covers this scenario: https://github.com/langchain-ai/langchain/blob/master/libs/partners/openai/tests/integration_tests/chat_models/test_base.py#L398-L399
- **Issue:** Issue opened in openai-python: openai/openai-python#1469
- **Dependencies:** N/A
- **Twitter handle:** N/A

Co-authored-by: Chester Curme <[email protected]>
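For code that proxies to both backends, one defensive pattern (a sketch, not part of any SDK) is to retry without `stream_options` when the backend rejects it. In real code the exception would be `openai.BadRequestError`; here a plain callable fakes the backend so the sketch is self-contained:

```python
def create_with_fallback(create_fn, **kwargs):
    """Call create_fn; if the backend rejects stream_options, retry without it."""
    try:
        return create_fn(**kwargs)
    except Exception as exc:  # in real code: openai.BadRequestError
        if "stream_options" in str(exc) and "stream_options" in kwargs:
            kwargs.pop("stream_options")
            return create_fn(**kwargs)
        raise

# Fake backend that mimics Azure's 400 on the unrecognized argument:
def fake_azure_create(**kwargs):
    if "stream_options" in kwargs:
        raise ValueError("Unrecognized request argument supplied: stream_options")
    return {"ok": True, "sent": sorted(kwargs)}

result = create_with_fallback(
    fake_azure_create,
    model="gpt-35-turbo",
    stream=True,
    stream_options={"include_usage": True},
)
print(result["ok"])  # True
```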
Hi all, Langchain released @kristapratico's change in …. We maintain …. I'm interested in knowing: why not use …? Thanks for your insight.
Hi all, following. I'm also interested in the Azure OpenAI REST API supporting `stream_options`. Thanks in advance!
This is a feature request for the Azure OpenAI API, not a bug report for this SDK, so I'm closing this issue.
Confirm this is an issue with the Python library and not an underlying OpenAI API
Describe the bug
I was following this article on streaming: https://cookbook.openai.com/examples/how_to_stream_completions.
I am using api-version "2023-05-15" with model "gpt-35-turbo".
code:
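The original snippet was lost in transcription. Based on the thread, the request presumably resembled the cookbook's streaming example with usage enabled, pointed at an Azure deployment. A sketch of the request body Azure's 2023-05-15 API rejects, shown as a plain dict rather than a live SDK call:

```python
# Request body that triggers the 400 "Unrecognized request argument supplied:
# stream_options" error on Azure OpenAI (api-version 2023-05-15):
request_body = {
    "model": "gpt-35-turbo",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
    "stream_options": {"include_usage": True},  # unrecognized by Azure
}
```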
To Reproduce
See the description above.
Code snippets
No response
OS
Win
Python version
3.10
Library version
1.30.1