index.as_query_engine does not seem to be working with local LLM #9769
-
🤖 Hello @candcconsulting! 👋 I'm Dosu, a friendly bot here to lend a hand with your LlamaIndex queries, debug your issues, and help you become a valuable contributor. While we're waiting for a human maintainer, I'm here to assist you. I'm ready to roll up my digital sleeves and get to work! Just remember, there's no problem too hairy for us to tackle... or should I say, too woolly? 🦙

Based on the information you've provided, it seems like the issue might be related to the … Here is a relevant code snippet from the `RetrieverQueryEngine`:

```python
def retrieve(self, query_bundle: QueryBundle) -> List[Node]:
    ...

def aretrieve(self, query_bundle: QueryBundle) -> List[Node]:
    ...

def _query(self, query_bundle: QueryBundle) -> List[Node]:
    ...

def _aquery(self, query_bundle: QueryBundle) -> List[Node]:
    ...
```

Source: `llama_index/query_engine/retriever_query_engine.py`

If you're passing an integer to the …

If you're already passing an instance of the …

Sources

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
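For context, "'int' object is not callable" is Python's way of saying an integer ended up somewhere that was later invoked like a function. Below is a minimal sketch of the intended call pattern, assuming llama-index 0.9.x; the data directory and query string are placeholders, and integer retriever settings such as similarity_top_k are passed explicitly by keyword:

```python
# Minimal sketch, assuming llama-index 0.9.x (not a verbatim fix).
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load documents and build the index (the path is a placeholder).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Integer settings such as similarity_top_k are passed by keyword so they
# cannot land in a slot that expects an object or a callable.
query_engine = index.as_query_engine(similarity_top_k=2)

# The engine is used through .query(); it is not called like a function.
response = query_engine.query("What is a question")
print(response)
```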
-
I have the same problem: 'RetrieverQueryEngine' object is not callable, when following the advanced LlamaParse RAG workflow.
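That exact message means the engine instance itself is being called like a function; the engine is driven through its query() (or async aquery()) method instead. A minimal sketch, assuming llama-index 0.9.x and a placeholder ./data directory:

```python
# Minimal sketch, assuming llama-index 0.9.x; ./data is a placeholder.
from llama_index import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
query_engine = index.as_query_engine()

# Calling the object directly is what raises
# TypeError: 'RetrieverQueryEngine' object is not callable
# response = query_engine("What is a question")

# The engine exposes query() / aquery() instead:
response = query_engine.query("What is a question")
print(response)
```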
-
Name: llama-index
Version: 0.9.23
I am using a local llama2 model:
llama2-model-gguf-AGL-Q4.gguf
I am building a local index, which works OK ...
However, the issue comes when I try to work with the index.
The output from the type statement is:

```
loading response : Query Engine is <class 'llama_index.query_engine.retriever_query_engine.RetrieverQueryEngine'> and has a query <class 'method'> handling request What is a question
```

However, I always get an error:

```
error occurred: 'int' object is not callable
```

Everything looks OK.
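A minimal sketch of this kind of local-GGUF setup, assuming llama-index 0.9.x with the LlamaCPP wrapper and llama-cpp-python installed (the model path, data directory, and parameter values below are placeholders, not the original code):

```python
# Minimal sketch, assuming llama-index 0.9.x plus llama-cpp-python and a
# local embedding backend (sentence-transformers); paths are placeholders.
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import LlamaCPP

# Wrap the local GGUF model so LlamaIndex uses it instead of OpenAI.
llm = LlamaCPP(
    model_path="./models/llama2-model-gguf-AGL-Q4.gguf",
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
)

# embed_model="local" resolves to a local HuggingFace embedding model,
# so no OpenAI key is needed for indexing either.
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# The index carries its service context, so the engine inherits the local LLM.
query_engine = index.as_query_engine()
response = query_engine.query("What is a question")
print(response)
```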
I do have this working outside of LlamaIndex using LangChain, but the indexing "seems" to be much quicker with LlamaIndex. However, if not using OpenAI causes too much pain, I will stop further investigation.
Is this a simple fix, or is the pain of using a local LLM too much effort?
Appreciate any input