Conflict Between OpenAI and Spellcheck Add-ons When Sending Prompt in NVDA 2023.3 #15

Open
rayo-alcantar opened this issue Dec 3, 2023 · 3 comments

Comments

@rayo-alcantar

I have encountered a compatibility issue between the OpenAI add-on (version 23.12.03) and the Spellcheck add-on (version 1.1) in NVDA (version 2023.3). The issue arises when attempting to send a prompt (using Control + Enter) while both add-ons are enabled.
Steps to Reproduce:

  1. Ensure both OpenAI and Spellcheck add-ons are enabled in NVDA.
  2. Try to send a prompt in OpenAI using Control + Enter.
  3. Observe that the process fails and an error is displayed, indicating missing arguments.
Expected Behavior:
The prompt should be sent and processed correctly, without errors, regardless of which other add-ons are enabled.

Actual Behavior:
When sending a prompt with OpenAI while Spellcheck is active, the process fails and NVDA displays an error related to missing arguments.

Temporary Solution:
After deactivating the Spellcheck add-on, OpenAI prompt submission works correctly. This suggests a conflict between the two add-ons.

Additional Information:
I am unable to provide an error trace because the issue prevents it from being captured. The problem is resolved only by deactivating Spellcheck, which points to a conflict between these two add-ons.

Environment:
• NVDA version: 2023.3
• OpenAI add-on version: 23.12.03
• Spellcheck add-on version: 1.1
@rperez030
Contributor

The biggest issue here, in my opinion, is that, since the OpenAI library runs in the main thread, that error message freezes the entire screen reader until the user dismisses the popup. That is assuming the OpenAI library is the one responsible for the popup message in the first place.
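For illustration only, a minimal sketch of keeping the blocking API call off NVDA's UI thread so an error cannot freeze the screen reader. It assumes wx (NVDA's GUI toolkit) is available; send_prompt_async, on_success, and on_error are hypothetical names, not the add-on's actual code.

import threading

import wx

def send_prompt_async(client, params, on_success, on_error):
    # Hypothetical helper: run the blocking request in a worker thread so a
    # failure (or a slow request) never blocks the screen reader's UI thread.
    def worker():
        try:
            response = client.chat.completions.create(**params)
        except Exception as error:
            # Marshal the error back to the UI thread; no message box should
            # be raised directly from the worker.
            wx.CallAfter(on_error, error)
        else:
            wx.CallAfter(on_success, response)
    threading.Thread(target=worker, daemon=True).start()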

@jmdaweb

jmdaweb commented Dec 4, 2023

In other words, a message box that is displayed outside the UI thread.
This issue is caused by other add-ons bundling different versions of the same dependencies that this add-on uses. OpenAI should ensure that all imported modules are the ones bundled with the add-on. This is quite a complex task; it may involve editing sys.path and removing entries from sys.modules before importing (see the sketch below).
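A hedged sketch of that isolation technique, assuming the add-on ships its dependencies in a lib directory next to the plugin; the directory name and the list of module prefixes below are illustrative, not the add-on's actual layout.

import os
import sys

def import_bundled_openai():
    # Put the add-on's own lib directory first on sys.path so its bundled
    # copies take precedence over modules shipped by other add-ons.
    lib_dir = os.path.join(os.path.dirname(__file__), "lib")
    if lib_dir not in sys.path:
        sys.path.insert(0, lib_dir)
    # Remove any copies of the shared dependencies that another add-on may
    # already have imported, so the next import resolves against lib_dir.
    for name in list(sys.modules):
        if name.split(".")[0] in ("openai", "httpx", "httpcore"):
            del sys.modules[name]
    import openai  # now loaded from the bundled copy
    return openai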

@AAClause
Owner

AAClause commented Dec 4, 2023

Thanks @rayo-alcantar for the report and @jmdaweb for the explanations.

For reference, here is the traceback:

Traceback (most recent call last):
  File "threading.pyc", line 926, in _bootstrap_inner
  File "D:\app\nvda\2023.3\userConfig\addons\OpenAI\globalPlugins\openai\maindialog.py", line 112, in run
    response = client.chat.completions.create(**params)
  File "D:\app\nvda\2023.3\userConfig\addons\OpenAI\globalPlugins\openai\lib\openai\_utils\_utils.py", line 301, in wrapper
    return func(*args, **kwargs)
  File "D:\app\nvda\2023.3\userConfig\addons\OpenAI\globalPlugins\openai\lib\openai\resources\chat\completions.py", line 628, in create
    stream_cls=Stream[ChatCompletionChunk],
  File "D:\app\nvda\2023.3\userConfig\addons\OpenAI\globalPlugins\openai\lib\openai\_base_client.py", line 1096, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "D:\app\nvda\2023.3\userConfig\addons\OpenAI\globalPlugins\openai\lib\openai\_base_client.py", line 861, in request
    remaining_retries=remaining_retries,
  File "D:\app\nvda\2023.3\userConfig\addons\OpenAI\globalPlugins\openai\lib\openai\_base_client.py", line 876, in _request
    request = self._build_request(options)
  File "D:\app\nvda\2023.3\userConfig\addons\OpenAI\globalPlugins\openai\lib\openai\_base_client.py", line 481, in _build_request
    **kwargs,
TypeError: build_request() got an unexpected keyword argument 'timeout'
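The failure is consistent with the explanation above: build_request is a method of the client's HTTP layer (httpx), and an older copy of httpx loaded by another add-on would not accept the timeout keyword argument. A quick, illustrative way to check which copies were actually imported, using NVDA's logHandler; the list of module names is an assumption:

import sys
from logHandler import log

# Log where the shared dependencies were actually loaded from. Paths that
# point at another add-on's folder rather than this add-on's lib directory
# would confirm the dependency conflict described above.
for name in ("openai", "httpx", "httpcore"):
    module = sys.modules.get(name)
    if module is not None:
        log.info("%s loaded from %s" % (name, getattr(module, "__file__", "unknown")))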
