I am currently using the llama_index library to develop a chatbot project, and I have run into an issue with the aquery method of the ChromaVectorStore class.
When I call the aquery method, I get TypeError: 'coroutine' object is not subscriptable. It looks as if aquery is returning a coroutine instead of an actual result.
I tried using the await keyword to wait for the coroutine to complete, but that did not solve the problem.
Could you help me resolve this? Is the aquery method supposed to return a coroutine or a direct result? Am I doing something wrong in my code? Here is the traceback:
```
Unexpected error in handle_query: 'coroutine' object is not subscriptable
Traceback (most recent call last):
  File "chatbot.py", line 394, in handle_query
    response = await self.chat_engine.astream_chat(query)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 367, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\callbacks\utils.py", line 56, in async_wrapper
    return await func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\chat_engine\context.py", line 344, in astream_chat
    nodes = await self._aget_nodes(message)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\chat_engine\context.py", line 150, in _aget_nodes
    nodes = await self._retriever.aretrieve(message)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 367, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\base\base_retriever.py", line 276, in aretrieve
    nodes = await self._aretrieve(query_bundle=query_bundle)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 367, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\indices\vector_store\retrievers\retriever.py", line 114, in _aretrieve
    return await self._aget_nodes_with_embeddings(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\indices\vector_store\retrievers\retriever.py", line 187, in _aget_nodes_with_embeddings
    query_result = await self._vector_store.aquery(query, **self._kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\core\vector_stores\types.py", line 427, in aquery
    return self.query(query, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\vector_stores\chroma\base.py", line 371, in query
    return self._query(
           ^^^^^^^^^^^^
  File "C:\IA\myvenv\Lib\site-packages\llama_index\vector_stores\chroma\base.py", line 395, in _query
    logger.debug(f"> Top {len(results['documents'][0])} nodes:")
```
The issue you're encountering, where a 'coroutine' object is not subscriptable, suggests that there might be a mix-up between synchronous and asynchronous code execution. Specifically, it seems that the aquery method is expected to be asynchronous, but it might be incorrectly called or handled in a synchronous context.
Here are a few steps to help resolve this issue:
Ensure Asynchronous Handling: Make sure that all calls to asynchronous methods, like aquery, are awaited properly. The error might occur if a coroutine is being treated as a regular object without awaiting it.
Check Method Implementation: Verify that the aquery method you are calling is actually asynchronous and that you use await when calling it.
Review the Call Stack: From the traceback, it seems that the aquery method is being called within another asynchronous method. Ensure that all methods in the call stack are correctly using await for asynchronous calls.
Check for Synchronous Calls: In your code, ensure that any synchronous methods are not inadvertently calling asynchronous methods without using await.
Library Version: Ensure that you are using the correct version of the llama_index library that supports the asynchronous features you are trying to use.
By following these steps, you should be able to resolve the issue with the aquery method returning a coroutine instead of a direct result. If the problem persists, double-check the implementation of the aquery method and how it is being called in your code.
@logan-markewich
Thank you for your response. I wanted to check whether there might be another possible solution to this issue; if you have any alternative suggestions, I would love to hear them.
Thanks
Bug Description
Here are the main lines of my code:
```python
self.db = await chromadb.AsyncHttpClient(host="localhost", port=8000)
# ......
chroma_collection = await self.db.get_or_create_collection(collection_name)
vector_store = ChromaVectorStore(chroma_collection=chroma_collection)

def create_index():
    return VectorStoreIndex.from_vector_store(
        vector_store,
        embed_model=self.embed_model,
        callback_manager=self.callback_manager,
        show_progress=True,
        use_async=True,
    )

# Run in a thread to avoid blocking
self.index_cache[collection_name] = await asyncio.to_thread(create_index)
index = self.index_cache[collection_name]
self.chat_engine = index.as_chat_engine(
    chat_mode=self.chat_mode,
    memory=self.memory_buffer,
    verbose=True,
    filters=filters,
)
response = await self.chat_engine.astream_chat(query)
```
Version
Version: 0.12.11
Steps to Reproduce
```python
response = await self.chat_engine.astream_chat(query)
```
Relevant Logs/Tracebacks