
🔥 0.3.0 Improvements and Planned New Features #4335

Open
imClumsyPanda opened this issue Jun 26, 2024 · 27 comments
Labels
enhancement New feature or request

Comments

@imClumsyPanda
Collaborator

imClumsyPanda commented Jun 26, 2024

🐞 Improvements

✅ Released

  • Fixed the missing-dependency errors that occurred while running the chatchat-kb -r command, caused by an incomplete dependency package

🕑 Completed, pending release

  • Configuration items can once again be edited via local files, in addition to the command-line method
  • Streamlined how model inference frameworks are integrated, reducing the required configuration; running models of all types in the xinference framework, except audio models, are now detected automatically.
  • Ctrl+C now aborts the chatchat-kb -r process on Windows
  • Fixed the issue where max_token had no effect

🏗️ To be developed

  • Fix errors that may occur when integrating with ollama
  • Knowledge base creation fails because model names started by ollama contain a colon (a rough workaround sketch follows this list)
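A note on the colon item above: ollama model names such as qwen2:7b contain a colon, and the failure presumably happens when that name is used to build a file path or store identifier (that failure point is an assumption, not confirmed in this thread). A minimal hypothetical workaround is to sanitize the name first; sanitize_model_name below is an illustrative helper, not part of chatchat:

# Hypothetical workaround sketch, not the project's actual fix: replace
# characters that are invalid in Windows paths before the ollama model
# name is used to build one.
import re

def sanitize_model_name(name: str) -> str:
    """Return a filesystem-safe variant of an ollama model name."""
    return re.sub(r"[:\\/]", "_", name)

print(sanitize_model_name("qwen2:7b"))  # -> qwen2_7b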

💡 New Features

✅ Released

🕑 Completed, pending release

🏗️ To be developed

  • Add a RAG test page for testing and comparing retrieval results and Q&A quality
  • A frontend based on lobe-chat
  • Support multimodal conversations
  • Support multimodal RAG
@imClumsyPanda imClumsyPanda added the enhancement New feature or request label Jun 26, 2024
@imClumsyPanda imClumsyPanda changed the title 0.3.0 Improvements and Planned New Features 🔥 0.3.0 Improvements and Planned New Features Jun 26, 2024
@imClumsyPanda imClumsyPanda pinned this issue Jun 26, 2024
@yuehua-s
Collaborator

The “max_token has no effect” issue has been fixed.

@630bdd

630bdd commented Jun 29, 2024

Will multimodal conversations and RAG be released in 0.3.0?

@imClumsyPanda
Collaborator Author

Will multimodal conversations and RAG be released in 0.3.0?

Yes. The current plan is one release for bug fixes and the next for new features.

@kogwang

kogwang commented Jun 30, 2024

Will 0.3.0 support concurrency?

@Qi0716

Qi0716 commented Jul 3, 2024

With the ollama framework, initializing the knowledge base in version 0.3 still errors out when running chatchat-kb -r:

(langchain) D:\other\Langchain-Chatchat-master>chatchat-kb -r
recreating all vector stores
C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_api\module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
warnings.warn(
2024-07-03 10:22:16,321 - utils.py[line:260] - ERROR: failed to create Embeddings for model: bge-large-zh-v1.5.
Traceback (most recent call last):
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\utils.py", line 258, in get_Embeddings
return LocalAIEmbeddings(**params)
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\pydantic\v1\main.py", line 341, in init
raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for LocalAIEmbeddings
root
Did not find openai_api_key, please add an environment variable OPENAI_API_KEY which contains it, or pass openai_api_key as a named parameter. (type=value_error)
2024-07-03 10:22:16,322 - faiss_cache.py[line:140] - ERROR: 'NoneType' object has no attribute 'embed_documents'
Traceback (most recent call last):
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 126, in load_vector_store
vector_store = self.new_vector_store(
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 63, in new_vector_store
vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_core\vectorstores.py", line 550, in from_documents
return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_community\vectorstores\faiss.py", line 930, in from_texts
embeddings = embedding.embed_documents(texts)
AttributeError: 'NoneType' object has no attribute 'embed_documents'
2024-07-03 10:22:16,323 - init_database.py[line:150] - ERROR: 向量库 samples 加载失败。
Traceback (most recent call last):
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 126, in load_vector_store
vector_store = self.new_vector_store(
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 63, in new_vector_store
vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_core\vectorstores.py", line 550, in from_documents
return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\langchain_community\vectorstores\faiss.py", line 930, in from_texts
embeddings = embedding.embed_documents(texts)
AttributeError: 'NoneType' object has no attribute 'embed_documents'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\init_database.py", line 129, in main
folder2db(
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\migrate.py", line 152, in folder2db
kb.create_kb()
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\base.py", line 102, in create_kb
self.do_create_kb()
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 57, in do_create_kb
self.load_vector_store()
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 32, in load_vector_store
return kb_faiss_pool.load_vector_store(
File "C:\Users\30759.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 141, in load_vector_store
raise RuntimeError(f"向量库 {kb_name} 加载失败。")
RuntimeError: 向量库 samples 加载失败。
2024-07-03 10:22:16,325 - init_database.py[line:151] - WARNING: Caught KeyboardInterrupt! Setting stop event...
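The validation error above is explicit: LocalAIEmbeddings is constructed without an API key, so pydantic rejects it before the embeddings (and hence the FAISS store) can be created. The error message itself names two workarounds; a minimal sketch of both follows, assuming ollama's OpenAI-compatible endpoint at http://localhost:11434/v1 and using a placeholder key (ollama does not validate it). Whether chatchat's own config forwards these values to LocalAIEmbeddings is an assumption here:

# Sketch of the two workarounds named in the error message above; key value
# and base URL are placeholders, not chatchat defaults.
import os

# Option 1: set the environment variable before running chatchat-kb -r.
os.environ["OPENAI_API_KEY"] = "EMPTY"  # any non-empty string passes the pydantic check

# Option 2: pass the key explicitly when constructing the embeddings object.
from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="EMPTY",                       # placeholder; ollama ignores it
    openai_api_base="http://localhost:11434/v1",  # assumed ollama OpenAI-compatible endpoint
    model="bge-large-zh-v1.5",
)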

1 similar comment

@mrkingsun

Suggestion: improve the frontend following mainstream designs. For example, “Conversations” should occupy the entire left sidebar, while knowledge base management, model configuration, and similar features should be integrated into a settings panel in the lower-left or upper-right corner.

@yuehua-s
Collaborator

yuehua-s commented Jul 4, 2024

Add a text2promql feature.

@Nancy7zt

Nancy7zt commented Jul 5, 2024

Will 0.3.0 support high concurrency?

@xldistance
Contributor

Will Microsoft's graphrag be supported?

@anderson-gznu

Will 0.3.0 support concurrency? Concurrency support in 0.2.0 is poor.

@HaKLMTT

HaKLMTT commented Jul 18, 2024

I don't have a GPU; can it run on CPU only?

@imClumsyPanda
Collaborator Author

@HaKLMTT Yes, use ollama.

@HaKLMTT

HaKLMTT commented Jul 19, 2024

@HaKLMTT Yes, use ollama.

Got it, thanks for the reply. Does it support domestic ARM platforms?

@1490113799

Can version 0.3 support configuring online models? Right now online LLMs can be configured through oneapi, but online embedding models cannot be loaded, and 0.3 has also removed the local embedding model feature that 0.2 had.

@kid-297

kid-297 commented Aug 9, 2024

Will Xinchuang (domestic IT) environments be supported?

@lizhenkai5008

When will ollama be supported?

@imClumsyPanda
Collaborator Author

@lizhenkai5008 It is already supported.

@ForgetThatNight

Does it support self-rag or agentic RAG? For example, multi-hop reasoning RAG questions such as “What did Roosevelt do in the year the War of Resistance was won?”

@ClementeGao

Is speech recognition supported?

@ClementeGao

Also, the WeChat QR code has expired; please update it. @imClumsyPanda

@imClumsyPanda
Collaborator Author

@ClementeGao The QR code has been updated.

@perfece

perfece commented Sep 27, 2024

Getting the agent right would solve a lot of problems and is the most visible way to improve intelligence, but agent-based Q&A currently has the following issues:

  1. The agent Q&A pipeline is slow; it takes around 30 seconds from question to answer, which ordinary users will not accept;
  2. The agent chain has to be customized per model; for example, after the project's chatglm3 agent was debugged into production, upgrading to glm4 means writing a new one.

Thanks to the chatchat team for such a great open-source project. Does anyone have good suggestions for the agent issues above?

@snowpalm

The project has had no major updates over the past few months, and I'd like to learn more about its latest direction. The WeChat QR code has expired; please update it, thanks. @imClumsyPanda

@liudong995

Suggestion: in RAG conversations, add an AI question-completion step before retrieving answers, to address inaccurate retrieval when the question depends on earlier context in the conversation (a rough sketch of this idea follows).
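One common way to realize this suggestion is to have the LLM rewrite the latest user question into a standalone query before it is sent to the retriever. The sketch below illustrates that idea only; it is not code from this project, and the model name, endpoint, API key, and prompt wording are all placeholders:

# Rough sketch of "question completion before retrieval"; model name,
# endpoint, and prompt are placeholders, not chatchat defaults.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="glm-4", base_url="http://127.0.0.1:9997/v1", api_key="EMPTY")

rewrite_prompt = ChatPromptTemplate.from_messages([
    ("system", "Rewrite the user's latest question into a standalone question, "
               "resolving pronouns and omissions using the chat history."),
    ("human", "Chat history:\n{history}\n\nLatest question: {question}"),
])

def rewrite_question(history: str, question: str) -> str:
    """Return a self-contained query to pass to the retriever instead of the raw question."""
    return (rewrite_prompt | llm).invoke({"history": history, "question": question}).content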

@sun8134

sun8134 commented Nov 18, 2024

@ClementeGao The QR code has been updated.

The QR code has expired again.

@imClumsyPanda
Collaborator Author

@ClementeGao The QR code has been updated.

The QR code has expired again.

I just tested it; it still works.
