
GPT4All v3.5.3 - A search in LocalDocs is carried out with no collections selected #3311

Closed
SINAPSA-IC opened this issue Dec 17, 2024 · 3 comments
Labels
bug-unconfirmed chat gpt4all-chat issues

Comments

@SINAPSA-IC

SINAPSA-IC commented Dec 17, 2024

Bug Report

A search of previously selected LocalDocs collections is still performed after no collections are selected anymore.

  1. This is proved not by the resulting reply, but by:
  • the "searching localdocs" message
  • processor/memory occupancy
  2. The same happens at the first and second prompt after de-selecting the previously selected collections. It also happens on several subsequent prompts; I did not determine under exactly which circumstances it stops (check, uncheck, one collection or more, the order of these operations), as that is the job of testers, if any.

However, the replies refer to knowledge that does not exist in those collections. Even if this is somehow expected (the models will try to answer regardless of info from RAG, lacking a proper prompt or System Prompt that would prevent them from doing so), it is obvious that despite searching those collections, no info is retrieved from them.

This is an old problem, persisting since before v3.0.
It should have been solved by now, as it relates to critical functionality of the program. However, despite users reporting it, it seems to have been put on the back burner and forgotten, or ignored altogether, because after 4-5 updates it is still manifest.

Steps to Reproduce

  1. Select (check) one (1) LocalDocs collection
  2. Ask the model something that exists in that collection; it will reply with info from it
  3. De-select (uncheck) the collection
  4. Ask the model anything
  5. "searching localdocs" appears, and CPU/memory get busy just as when the collection was selected

Expected Behavior

The search in LocalDocs should not be carried out if no collections are selected.
Pseudocode:

```
if (num_selected_ldcols == 0)
    goto search_only_internalknowledge
```

Your Environment

  • GPT4All version: v3.5.3
  • Operating System: Win10 b19042
  • Chat model used (if applicable): Nous Hermes 2 Mistral DPO, and others
@SINAPSA-IC SINAPSA-IC added bug-unconfirmed chat gpt4all-chat issues labels Dec 17, 2024
@SINAPSA-IC SINAPSA-IC changed the title GPR4all v3.5.3 - A search in LocalDocs is carried out with no collections selected GPT4All v3.5.3 - A search in LocalDocs is carried out with no collections selected Dec 17, 2024
@manyoso
Collaborator

manyoso commented Dec 17, 2024

Can't reproduce:
Image

@SINAPSA-IC
Author

SINAPSA-IC commented Dec 17, 2024

But you can investigate, as 1) this was reported way back, in versions prior to 3.0, and I am now seeing it in 3.5.3, and 2) it relates to critical functionality. Something can surely be done to investigate, such as hooking the text of that label to a variable and checking whether it does indeed contain "searching localdocs..." when no collections are selected. And that video moves so fast that I cannot discern anything, because the Light color scheme burns the eyes, along with small, thin text for reduced contrast.
Perish the thought that the program moving that fast does not in fact leave the user enough time to actually read what is written on that label, even more so when the files in the collection are only a few, and all smallish. If it happens here, it is surely happening elsewhere, for other users. Ah yes, more importantly: this issue is old; has it been solved? I did not see a "Yes, it was known and has been solved", but then again, it may have escaped me.
As someone else here said in another context, also aimed at improving the program: that video amounts to gaslighting, yet the problem is really there. At least, out of curiosity to reproduce and investigate the issue, the video could be slowed down to, say, quarter speed, to be able to read those two words which do appear, as I explained.
And another thing: testing (if any at all) should be done on "consumer-grade hardware", not on industrial-grade rigs, which are not the target audience the program is advertised for.

@manyoso closed this as not planned (won't fix / can't repro / duplicate / stale) Dec 18, 2024
@manyoso
Collaborator

manyoso commented Dec 18, 2024

Can't reproduce, as I said, and the issue creator is unable or unwilling to help constructively find a way to reproduce.
