
Can function call models support streaming output? #6063

Open
1 task done
SafeCool opened this issue Nov 18, 2024 · 0 comments
Labels
pending This problem is yet to be addressed

Comments

@SafeCool
Reminder

  • I have read the README and searched the existing issues.

System Info

  • llamafactory version: 0.9.1.dev0
  • Platform: Linux-5.4.119-19-0009.11-x86_64-with-glibc2.35
  • Python version: 3.11.7
  • PyTorch version: 2.3.0+cu121 (GPU)
  • Transformers version: 4.41.2
  • Datasets version: 2.19.2
  • Accelerate version: 0.30.1
  • PEFT version: 0.11.1
  • TRL version: 0.8.6
  • GPU type: NVIDIA Graphics Device
  • DeepSpeed version: 0.14.0
  • vLLM version: 0.4.3

Reproduction

result = client.chat.completions.create(
    messages=messages,
    model="qwen",
    tools=tools,
    tool_choice="auto",
    stream=True,
)

Expected behavior

Passing tools together with stream=True raises an error. Could streaming output be supported when tools are used? For example: when the response is a function call, fall back to non-streaming output; otherwise, stream as usual.
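As a client-side workaround until this is supported server-side, a thin wrapper can force `stream=False` whenever tools are supplied, which matches the fallback behavior requested above. This is a hypothetical sketch (the `resolve_stream` and `chat` helper names are not part of LLaMA-Factory or the OpenAI SDK), assuming an OpenAI-compatible `client.chat.completions.create` endpoint:

```python
def resolve_stream(requested_stream, tools=None):
    """Return the stream flag to actually send: disable streaming
    whenever tools are present, since the server rejects that combination."""
    if tools:
        return False
    return bool(requested_stream)


def chat(client, messages, model, tools=None, stream=True, **kwargs):
    """Wrapper around client.chat.completions.create that silently
    falls back to a non-streaming request when tools are supplied."""
    if tools:
        # Only send tool-related parameters when tools are actually given.
        kwargs.update(tools=tools, tool_choice="auto")
    return client.chat.completions.create(
        messages=messages,
        model=model,
        stream=resolve_stream(stream, tools),
        **kwargs,
    )
```

With this wrapper, `chat(client, messages, model="qwen", stream=True)` streams normally, while adding `tools=tools` transparently switches the same call to a blocking request.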

Others

No response

@github-actions github-actions bot added the pending This problem is yet to be addressed label Nov 18, 2024