System Info / 系統信息
CUDA version: 12.1
PyTorch version: 2.5.1
OS: Windows 10
Python version: 3.9.11
transformers: 4.46.3
Running Xinference with Docker? / 是否使用 Docker 运行 Xinference?
No, Xinference is run locally (not in Docker); see the startup command below.
Version info / 版本信息
xinference 1.0.0
The command used to start Xinference / 用以启动 xinference 的命令
xinference-local --host 127.0.0.1 --port 9997
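This command only starts the Xinference server; the model requested in the API call still has to be launched on it under a matching name/UID. A minimal sketch for listing which model UIDs the server is actually serving, via its OpenAI-compatible /v1/models route (this check is not part of the original report; it assumes the same base URL and placeholder API key as the reproduction code below):

from openai import OpenAI  # OpenAI Python SDK v1-style client

client = OpenAI(api_key="EMPTY", base_url="http://127.0.0.1:9997/v1")

# /v1/models returns the models currently served by this endpoint;
# the ids printed here are the names that chat.completions.create accepts.
for model in client.models.list():
    print(model.id)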
Reproduction / 复现过程
from openai import OpenAI

openai_base_url = "http://127.0.0.1:9997/v1"
client = OpenAI(
    api_key="EMPTY",
    base_url=openai_base_url,
)

page_image_path = r"D:\PythonDemo\pdf_to_md_demo\pdf2markdown\output\0.png"

messages2 = [{
    "role": "user",
    "content": [
        {
            "type": "image_url",
            "image_url": {
                # encode the local image as base64 and embed it as a data URL
                "url": f"data:image/png;base64,{encode_base64_content_from_local(page_image_path)}",
            },
        },
        {"type": "text", "text": "What is in this image?"},
    ],
}]

resp = client.chat.completions.create(
    messages=messages2,
    model="qwen2-vl-7b-instruct",
    temperature=0.2,
)
print(resp)
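The snippet above calls encode_base64_content_from_local, whose body is not shown in this issue; judging from the comment, it simply reads the image file and base64-encodes it. A minimal sketch of such a helper, as an assumption rather than the actual implementation used:

import base64

def encode_base64_content_from_local(image_path: str) -> str:
    # Assumed helper: read a local image and return its bytes as a
    # base64 string, ready to be embedded in a data: URL.
    with open(image_path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

Running the reproduction code produces the following traceback: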
File "D:\PythonDemo\pdf_to_md_demo\pdf2markdown\pdf_to_markdown.py", line 264, in openai_test
resp = client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^
File "D:\PythonDemo\pdf_to_md_demo\venv\Lib\site-packages\openai_utils_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^
File "D:\PythonDemo\pdf_to_md_demo\venv\Lib\site-packages\openai\resources\chat\completions.py", line 829, in create
return self._post(
^^^^^^^
File "D:\PythonDemo\pdf_to_md_demo\venv\Lib\site-packages\openai_base_client.py", line 1278, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\PythonDemo\pdf_to_md_demo\venv\Lib\site-packages\openai_base_client.py", line 955, in request
return self._request(
^^^^^^^^
File "D:\PythonDemo\pdf_to_md_demo\venv\Lib\site-packages\openai_base_client.py", line 1059, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'detail': '[address=10.27.164.119:61057, pid=36672] Model not found, uid: qwen2-vl-7b-instruct-0'}
Expected behavior / 期待表现
I hope the error above can be explained, so that calling Xinference through the OpenAI-compatible API works as expected.