ComfyUI nodes utilizing LLM models on QianFan Platform (百度智能云千帆大模型平台)
- Clone into the `custom_nodes` folder of ComfyUI
- Install the SDK(s) by running `python -m pip install -U -r requirements.txt`
- Run ComfyUI first, and then edit `config.yaml` to fill in your own credentials
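The exact keys expected in `config.yaml` are defined by this repository; as a rough orientation only, the two underlying SDKs usually pick up credentials of the following kind (environment-variable names as documented by the qianfan and appbuilder SDKs, values are placeholders):

```python
import os

# Credentials for the 千帆大模型平台 (qianfan) SDK:
# an application API key / secret key pair.
os.environ["QIANFAN_AK"] = "your_qianfan_api_key"      # placeholder
os.environ["QIANFAN_SK"] = "your_qianfan_secret_key"   # placeholder

# Credential token for the 千帆AppBuilder (appbuilder) SDK.
os.environ["APPBUILDER_TOKEN"] = "your_appbuilder_token"  # placeholder
```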
Document: 千帆AppBuilder-SDK
Implemented: DialogSummary, PlayGround
DialogSummary inputs:
- model: selection of the model to use
- dialog: the dialog string to be summarized
Hint: only some of the models respond with pure JSON that can be parsed, so choose the model carefully
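Because not every model returns clean JSON, a guard like the following can help when consuming the summary; this is an illustrative sketch, not code from this repository:

```python
import json
import re

def parse_model_json(text: str):
    """Try to parse a model reply as JSON, falling back to the first
    {...} block embedded in surrounding prose."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        match = re.search(r"\{.*\}", text, re.DOTALL)
        if match:
            return json.loads(match.group(0))
        raise ValueError("model reply contains no parseable JSON")
```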
PlayGround inputs:
- model: selection of the model to use
- prompt_template: used with `str.format` to fill in params
- params: YAML format, parsed into a dict and passed to `str.format`
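The template/params mechanism amounts to the following (a minimal sketch of the documented behaviour using PyYAML; the template text and parameter names are illustrative):

```python
import yaml

prompt_template = "Write a {style} poem about {topic}."
params_yaml = """
style: short
topic: ComfyUI
"""

# Parse the YAML params into a dict and substitute them into the template.
params = yaml.safe_load(params_yaml)
prompt = prompt_template.format(**params)
print(prompt)  # Write a short poem about ComfyUI.
```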
Document: 千帆大模型平台-SDK
Implemented: Chat, Completion
Chat inputs:
- model: selection of the model to use
- current message: YAML format, a list of dicts; `role` and `content` are required
- endpoint: optional, only used when model is set to ENDPOINT
- history messages: optional, YAML format, same layout as current message

The node concatenates the history messages and the current message, then sends the combined list to the model.
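Put together, the message handling looks roughly like this; the sketch assumes the `qianfan` SDK's `ChatCompletion` interface and PyYAML, and the model name and YAML content are placeholders:

```python
import yaml
import qianfan  # credentials are read from the environment / SDK config

history_yaml = """
- role: user
  content: 你好
- role: assistant
  content: 你好，请问有什么可以帮您？
"""
current_yaml = """
- role: user
  content: 请用一句话介绍 ComfyUI
"""

history = yaml.safe_load(history_yaml) or []
current = yaml.safe_load(current_yaml)

# Every message must carry the required keys.
messages = history + current
assert all("role" in m and "content" in m for m in messages)

resp = qianfan.ChatCompletion().do(model="ERNIE-Bot-turbo", messages=messages)
print(resp["result"])
```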
Completion inputs:
- model: selection of the model to use
- prompt: prompt text
- endpoint: optional, only used when model is set to ENDPOINT
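For reference, a bare call through the `qianfan` SDK's `Completion` interface looks like this (a sketch only; the model name is a placeholder, and an endpoint name would be used instead of a model name for a custom ENDPOINT deployment):

```python
import qianfan

resp = qianfan.Completion().do(model="ERNIE-Bot-turbo", prompt="用一句话介绍 ComfyUI")
print(resp["result"])

# When the node's model is set to ENDPOINT, the custom deployment is
# addressed by its endpoint name instead of a model name, e.g.:
# resp = qianfan.Completion().do(endpoint="your_endpoint_name", prompt="...")
```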