httpx.ConnectError: [Errno 111] Connection refused #6119

Open
1 task done
cuiMidAutumn opened this issue Nov 23, 2024 · 0 comments
Labels
pending This problem is yet to be addressed

Comments

@cuiMidAutumn

Reminder

  • I have read the README and searched the existing issues.

System Info

Running llamafactory-cli webui always fails with the error below. What is causing this, and how can it be fixed?

/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/transformers/utils/hub.py:128: FutureWarning: Using TRANSFORMERS_CACHE is deprecated and will be removed in v5 of Transformers. Use HF_HOME instead.
warnings.warn(
Running on local URL: http://0.0.0.0:7860
Traceback (most recent call last):
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_transports/default.py", line 72, in map_httpcore_exceptions
yield
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_transports/default.py", line 236, in handle_request
resp = self._pool.handle_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
raise exc from None
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
response = connection.handle_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
raise exc
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
stream = self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 124, in _connect
stream = self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
with map_exceptions(exc_map):
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/contextlib.py", line 155, in exit
self.gen.throw(value)
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/usr/local/miniconda3/envs/Llama_factory/bin/llamafactory-cli", line 8, in
sys.exit(main())
^^^^^^
File "/data/chb/LLaMA-Factory/src/llamafactory/cli.py", line 115, in main
run_web_ui()
File "/data/chb/LLaMA-Factory/src/llamafactory/webui/interface.py", line 91, in run_web_ui
create_ui().queue().launch(share=gradio_share, server_name=server_name, inbrowser=True)
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/gradio/blocks.py", line 2408, in launch
httpx.get(
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_api.py", line 210, in get
return request(
^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_api.py", line 118, in request
return client.request(
^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_client.py", line 837, in request
return self.send(request, auth=auth, follow_redirects=follow_redirects)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_client.py", line 926, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_client.py", line 954, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_client.py", line 991, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_client.py", line 1027, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_transports/default.py", line 235, in handle_request
with map_httpcore_exceptions():
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/contextlib.py", line 155, in exit
self.gen.throw(value)
File "/usr/local/miniconda3/envs/Llama_factory/lib/python3.12/site-packages/httpx/_transports/default.py", line 89, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 111] Connection refused
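
For context on where the error comes from: the traceback shows that gradio's launch() (blocks.py line 2408) immediately issues an httpx.get() against the local URL it just printed, and that self-check is what fails with [Errno 111] Connection refused. One common culprit is an HTTP proxy configured through environment variables, which makes httpx send the localhost request to the proxy instead of to the local Gradio server. The following is only a minimal diagnostic sketch, not part of LLaMA-Factory; the port 7860 is taken from the "Running on local URL" line above, and the proxy variable names are just the usual conventions.

# Hypothetical diagnostic (not part of LLaMA-Factory): repeat the health check
# that gradio's launch() performs, once honoring proxy environment variables
# and once ignoring them, to see whether a proxy is intercepting the request.
import os
import httpx

url = "http://127.0.0.1:7860"  # port taken from "Running on local URL" above

for var in ("http_proxy", "https_proxy", "all_proxy", "no_proxy",
            "HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "NO_PROXY"):
    if var in os.environ:
        print(f"{var} = {os.environ[var]}")

for trust_env in (True, False):
    try:
        # trust_env=False makes httpx ignore proxy environment variables
        r = httpx.get(url, trust_env=trust_env, timeout=5)
        print(f"trust_env={trust_env}: HTTP {r.status_code}")
    except httpx.ConnectError as exc:
        print(f"trust_env={trust_env}: ConnectError: {exc}")

If the request succeeds only with trust_env=False, unsetting the proxy variables (or adding 127.0.0.1, 0.0.0.0 and localhost to no_proxy) before running llamafactory-cli webui may be worth trying; if both attempts fail, the Gradio server itself likely never started listening on the port.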

Reproduction

Running llamafactory-cli webui, or CUDA_VISIBLE_DEVICES=0 llamafactory-cli webchat
--model_name_or_path /media/codingma/LLM/llama3/Meta-Llama-3-8B-Instruct
--template llama3, always fails with the error below. How can it be fixed?

The output is identical to the System Info section above: the TRANSFORMERS_CACHE FutureWarning, the "Running on local URL: http://0.0.0.0:7860" message, and then the same httpx.ConnectError: [Errno 111] Connection refused traceback.
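
For reference, here is a small check (again only a sketch, not part of the project; the port 7860 is assumed from the log) that can separate the two usual causes of Errno 111 here: nothing listening on the port, versus the client-side request being rerouted or blocked.

# Hypothetical check: is anything actually listening on the WebUI port?
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(2)
    listening = s.connect_ex(("127.0.0.1", 7860)) == 0

if listening:
    print("Port 7860 is open - the server started, so the failing request is "
          "probably being rerouted on the client side (e.g. by proxy settings).")
else:
    print("Nothing is listening on 127.0.0.1:7860 - the Gradio server did not "
          "come up; check whether another process, a firewall, or the bind "
          "address is the problem.")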

Expected behavior

No response

Others

No response

@github-actions github-actions bot added the pending (This problem is yet to be addressed) label on Nov 23, 2024