[Bug] Deploying S-LoRA fails with ValueError: not enough values to unpack (expected 2, got 1) #1030
Comments
The `adapters` input should be a key-value pair.
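The ValueError in the title is what you get when a `name=path` pair is expected but only a bare path is supplied. A minimal sketch of such parsing (a hypothetical helper for illustration, not lmdeploy's actual code):

```python
def parse_adapters(entries):
    """Parse CLI adapter entries of the form 'name=path' into a dict."""
    adapters = {}
    for entry in entries:
        # A bare path such as '/root/lora' contains no '=', so split()
        # yields a single element and the unpack raises:
        #   ValueError: not enough values to unpack (expected 2, got 1)
        name, path = entry.split('=', 1)
        adapters[name] = path
    return adapters

print(parse_adapters(['adapter0=/path/to/adapter0']))
# → {'adapter0': '/path/to/adapter0'}
```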
After updating the code, a new error appeared; I get the same error whether I pass a single path or a key-value pair.
Please provide a dummy adapter so I can debug.
This is a self-cognition adapter trained on Baichuan2-13B-Chat with the official LLaMA-Efficient-Tuning dataset. The error above occurs when loading it.
Updated; it should be correct now.
Are you using the latest branch? Model memory usage should be ~14.5 GB after commit 5ae19a1.
Baichuan2-13B-Chat is 26 GB. I re-downloaded the branch code and it ran successfully on two 4090s, using 46.5 GB in total.
One more question, though: how do I configure multiple adapters?
```python
from lmdeploy.messages import PytorchEngineConfig
from lmdeploy.pytorch.engine.engine import Engine

adapters = {'adapter0': '/path/to/adapter0', 'adapter1': '/path/to/adapter1'}
engine_config = PytorchEngineConfig(adapters=adapters)
engine = Engine.from_pretrained(model_path,
                                engine_config=engine_config,
                                trust_remote_code=True)
generator = engine.create_instance()
for outputs in generator.stream_infer(session_id=session_id,
                                      input_ids=input_ids,
                                      gen_config=gen_config,
                                      adapter_name='adapter0'):
    # read outputs
    ...

# close the session and release caches
generator.end(session_id)
```
@grimoire does the RESTful API support configuring multiple adapters?
Coming soon.
Checklist
Describe the bug
I trained Baichuan2-13B-Chat with the project https://github.com/edw008/LLaMA-Efficient-Tuning and obtained a LoRA weight. When I deploy it with this project, an error occurs. How should I deploy it?
The command I used was: lmdeploy chat torch --tp 2 --session-len 1024 --adapters ~/autodl-fs/lora/2024-01-09-11-53-38_all /root/autodl-tmp/baichuan-inc/Baichuan2-13B-Chat
Reproduction
lmdeploy chat torch --tp 2 --session-len 1024 --adapters ~/autodl-fs/lora/2024-01-09-11-53-38_all /root/autodl-tmp/baichuan-inc/Baichuan2-13B-Chat
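Given the maintainer's note that `adapters` should be key-value pairs, the fix on the CLI side would presumably be to name the adapter explicitly. A sketch of the adjusted invocation (`mylora` is an assumed adapter name, not from the original report):

```shell
lmdeploy chat torch --tp 2 --session-len 1024 \
    --adapters mylora=~/autodl-fs/lora/2024-01-09-11-53-38_all \
    /root/autodl-tmp/baichuan-inc/Baichuan2-13B-Chat
```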
Environment
- OS: Ubuntu 20.04
- GPU: RTX 4090 (24 GB) × 3
- Python: 3.9.18
- PyTorch: 2.1.2
Error traceback