Description
I've enabled NanoGraphRAG in the Docker image and I'm getting an "EmbeddingFunc is not defined" error.
This is the same issue that was closed in #458; my UI shows the same error as that report.
I pulled the latest main-full image this morning and am running it locally against Ollama.
I've tried both Llama3.1 8b and Qwen2.5 14b as the LLM.
Reproduction steps
1. Run latest image with Nano enabled:
`docker run --platform linux/amd64 --net=host -e GRADIO_SERVER_NAME=0.0.0.0 -e GRADIO_SERVER_PORT=7860 -e USE_NANO_GRAPHRAG=true -e USE_CUSTOMIZED_GRAPHRAG_SETTING=false -p 7860:7860 -v ~/external-ssd/CodeReady/kotaemon/ktem_app_data:/app/ktem_app_data -it ghcr.io/cinnamon/kotaemon:main-full`
2. Setup Nomic Embed as default Embeddings model via Ollama, and set Qwen2.5 or Llama3.1 as default LLM via Ollama.
3. Leave the Index Collections setup unchanged (Nano is set to `embedding: default`).
4. Upload a PDF file to NanoGraphRAG, and note the UI error message on the final step.
Screenshots
![DESCRIPTION](LINK.png)
Logs
Session reasoning type simple use mindmap False
Session LLM Qwen2_5
Reasoning class <class 'ktem.reasoning.simple.FullQAPipeline'>
Reasoning state {'app': {'regen': False}, 'pipeline': {}}
Thinking ...
Retrievers [DocumentRetrievalPipeline(DS=<kotaemon.storages.docstores.lancedb.LanceDBDocumentStore object at 0x7f69d1a51f00>, FSPath=PosixPath('/app/ktem_app_data/user_data/files/index_1'), Index=<class 'ktem.index.file.index.IndexTable'>, Source=<class 'ktem.index.file.index.Source'>, VS=<kotaemon.storages.vectorstores.chroma.ChromaVectorStore object at 0x7f69d1a528f0>, get_extra_table=False, llm_scorer=LLMTrulensScoring(concurrent=True, normalize=10, prompt_template=<kotaemon.llms.prompts.template.PromptTemplate object at 0x7f693565d840>, system_prompt_template=<kotaemon.llms.prompts.template.PromptTemplate object at 0x7f693565d9f0>, top_k=3, user_prompt_template=<kotaemon.llms.prompts.template.PromptTemplate object at 0x7f693565dc00>), mmr=False, rerankers=[CohereReranking(cohere_api_key='<COHERE_API_KEY>', model_name='rerank-multilingual-v2.0')], retrieval_mode='hybrid', top_k=10, user_id=1), GraphRAGRetrieverPipeline(DS=<theflow.base.unset_ object at 0x7f6ac85a85b0>, FSPath=<theflow.base.unset_ object at 0x7f6ac85a85b0>, Index=<class 'ktem.index.file.index.IndexTable'>, Source=<theflow.base.unset_ object at 0x7f6ac85a85b0>, VS=<theflow.base.unset_ object at 0x7f6ac85a85b0>, file_ids=[], user_id=<theflow.base.unset_ object at 0x7f6ac85a85b0>), NanoGraphRAGRetrieverPipeline(DS=<theflow.base.unset_ object at 0x7f6ac85a85b0>, FSPath=<theflow.base.unset_ object at 0x7f6ac85a85b0>, Index=<class 'ktem.index.file.index.IndexTable'>, Source=<theflow.base.unset_ object at 0x7f6ac85a85b0>, VS=<theflow.base.unset_ object at 0x7f6ac85a85b0>, file_ids=['b6d008a4-40b4-4d7e-80d6-c4e81bbd7862'], user_id=<theflow.base.unset_ object at 0x7f6ac85a85b0>)]
searching in doc_ids []
INFO:ktem.index.file.pipelines:Skip retrieval because of no selected files: DocumentRetrievalPipeline(
(vector_retrieval): <function Function._prepare_child.<locals>.exec at 0x7f69cb7e9990>
(embedding): <function Function._prepare_child.<locals>.exec at 0x7f69cb7e9cf0>
)
INFO:httpx:HTTP Request: POST http://localhost:11434/v1/embeddings "HTTP/1.1 200 OK"
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 575, in process_events
response = await route_utils.call_process_api(
File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
output = await app.get_blocks().process_api(
File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1923, in process_api
result = await self.call_function(
File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1520, in call_function
prediction = await utils.async_iteration(iterator)
File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 663, in async_iteration
return await iterator.__anext__()
File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 656, in __anext__
return await anyio.to_thread.run_sync(
File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
return await future
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 943, in run
result = context.run(func, *args)
File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 639, in run_sync_iterator_async
return next(iterator)
File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 801, in gen_wrapper
response = next(iterator)
File "/app/libs/ktem/ktem/pages/chat/__init__.py", line 835, in chat_fn
for response in pipeline.stream(chat_input, conversation_id, chat_history):
File "/app/libs/ktem/ktem/reasoning/simple.py", line 741, in stream
docs, infos = self.retrieve(message, history)
File "/app/libs/ktem/ktem/reasoning/simple.py", line 517, in retrieve
retriever_docs = retriever_node(text=query)
File "/usr/local/lib/python3.10/site-packages/theflow/base.py", line 1097, in __call__
raise e from None
File "/usr/local/lib/python3.10/site-packages/theflow/base.py", line 1088, in __call__
output = self.fl.exec(func, args, kwargs)
File "/usr/local/lib/python3.10/site-packages/theflow/backends/base.py", line 151, in exec
return run(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/theflow/middleware.py", line 144, in __call__
raise e from None
File "/usr/local/lib/python3.10/site-packages/theflow/middleware.py", line 141, in __call__
_output = self.next_call(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/theflow/middleware.py", line 117, in __call__
return self.next_call(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/theflow/base.py", line 1017, in _runx
return self.run(*args, **kwargs)
File "/app/libs/ktem/ktem/index/file/graph/nano_pipelines.py", line 384, in run
graphrag_func, query_params = self._build_graph_search()
File "/app/libs/ktem/ktem/index/file/graph/nano_pipelines.py", line 316, in _build_graph_search
llm_func, embedding_func, _, _ = get_default_models_wrapper()
File "/app/libs/ktem/ktem/index/file/graph/nano_pipelines.py", line 101, in get_default_models_wrapper
embedding_func = EmbeddingFunc(
NameError: name 'EmbeddingFunc' is not defined
INFO:httpx:HTTP Request: POST http://localhost:11434/v1/chat/completions "HTTP/1.1 200 OK"
User-id: 1, can see public conversations: True
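For what it's worth, the NameError at the bottom of the traceback means the name `EmbeddingFunc` is simply not bound in `nano_pipelines.py` at the moment `get_default_models_wrapper()` runs, which points at a missing (or silently failed) import rather than a model or Ollama configuration problem. A minimal, self-contained sketch of the mechanism, with illustrative names and values only (this is not kotaemon's actual code):

```python
def get_default_models_wrapper():
    # EmbeddingFunc was never imported into this module's namespace, so
    # Python raises NameError only when this line actually executes --
    # the module itself imports cleanly, which is why the app starts fine
    # and the error only surfaces when the Nano retriever is used.
    return EmbeddingFunc(embedding_dim=768, max_token_size=8192, func=lambda texts: texts)


try:
    get_default_models_wrapper()
except NameError as exc:
    print(exc)  # name 'EmbeddingFunc' is not defined
```

If that is the cause, the fix would presumably be an explicit import of `EmbeddingFunc` from wherever the installed nano-graphrag version exposes it (upstream nano-graphrag defines it in `nano_graphrag._utils`), ideally guarded so a missing optional dependency fails with a clearer message.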
Browsers
Firefox
OS
Linux
Additional information
No response