
Re-explore use of agent-based chat engine #126

Open
rwood-97 opened this issue Nov 1, 2023 · 0 comments

Comments

@rwood-97
Contributor

rwood-97 commented Nov 1, 2023

At the moment we use the context chat engine, which forces Reginald to always search our database for relevant information and answer based on that.
In llama-index there is also a ReAct chat engine, which takes your question and then uses an LLM to decide whether to search the database or just use existing 'knowledge' (i.e. data it was trained on). We think this decision is made based on the question only (i.e. nothing to do with the contents of the database).
We tried the ReAct chat engine with llama2 but weren't super happy with its decision-making on whether to search or not. See #64.
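A toy sketch of what "decide from the question only" could look like, with a hand-written heuristic standing in for the LLM routing step (the cue words and function names here are hypothetical, not from llama-index):

```python
def should_search(question: str) -> bool:
    """Toy stand-in for the LLM routing step: decide from the question
    alone (not the database contents) whether to hit the database."""
    # Hypothetical heuristic: project-specific cues trigger retrieval.
    retrieval_cues = ("reginald", "our ", "handbook", "wiki", "internal")
    return any(cue in question.lower() for cue in retrieval_cues)

def answer(question: str) -> str:
    """Route to retrieval or prior knowledge based on the decision."""
    if should_search(question):
        return f"[retrieved] answer to: {question}"
    return f"[prior knowledge] answer to: {question}"
```

The point of the sketch is that the router never sees the database, which is exactly why it can make bad search/no-search calls.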

We think it would be nice to re-explore the agent-based chat engines and/or using both the database and prior knowledge. In particular, it would be good to have a setup like:

  1. query the database,
  2. formulate an answer,
  3. check whether the answer answers the question and decide whether to reply, query again, or use prior knowledge,
  4. reformulate the answer,
  5. check whether the answer answers the question,
  6. reply or say 'I don't know'.
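The loop above could be sketched roughly as follows; the callables are placeholders for the retrieval, LLM, and self-check steps (all names here are hypothetical, not an existing llama-index API):

```python
from typing import Callable

def agent_answer(
    question: str,
    query_db: Callable[[str], str],
    formulate: Callable[[str, str], str],
    answers_question: Callable[[str, str], bool],
    prior_knowledge: Callable[[str], str],
    max_retries: int = 1,
) -> str:
    """Sketch of the proposed loop: query, answer, self-check,
    retry or fall back to prior knowledge, then reply or admit ignorance."""
    # Steps 1-2: query the database and formulate an answer.
    context = query_db(question)
    candidate = formulate(question, context)
    for _ in range(max_retries):
        # Step 3: decide whether to reply or try something else.
        if answers_question(question, candidate):
            return candidate  # step 6: reply
        # Step 4: reformulate, here by falling back to prior knowledge.
        candidate = formulate(question, prior_knowledge(question))
    # Steps 5-6: final check, then reply or say "I don't know".
    if answers_question(question, candidate):
        return candidate
    return "I don't know"
```

A real implementation would make the step-3 decision with an LLM call rather than a boolean, but the control flow would be the same.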

Potential options are:

  • Re-explore ReAct chat engine
  • Create our own chat engine and potentially PR to llama-index
Labels: none yet · Projects: none yet · Development: no branches or pull requests · 1 participant