Server cannot start? Who can help me? #2414

Open
ferry1376122 opened this issue Mar 7, 2025 · 1 comment

Comments

@ferry1376122

2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.dag.loader[25888] INFO Importing D:\python\DB-GPT\examples/awel\simple_rag_summary_example.py
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.dag.loader[25888] ERROR Failed to import: D:\python\DB-GPT\examples/awel\simple_rag_summary_example.py, error message: Traceback (most recent call last):
File "D:\python\DB-GPT\packages\dbgpt-core\src\dbgpt\core\awel\dag\loader.py", line 91, in parse
loader.exec_module(new_module)
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "D:\python\DB-GPT\examples/awel\simple_rag_summary_example.py", line 65, in
llm_client=OpenAILLMClient(), language="en"
^^^^^^^^^^^^^^^^^
File "D:\python\DB-GPT\packages\dbgpt-core\src\dbgpt\model\proxy\llms\chatgpt.py", line 178, in init
_ = self.client.default_headers
^^^^^^^^^^^
File "D:\python\DB-GPT\packages\dbgpt-core\src\dbgpt\model\proxy\llms\chatgpt.py", line 229, in client
self._api_type, self._client = _build_openai_client(
^^^^^^^^^^^^^^^^^^^^^
File "D:\python\DB-GPT\packages\dbgpt-core\src\dbgpt\model\utils\chatgpt_utils.py", line 103, in _build_openai_client
openai_params, api_type, api_version, api_azure_deployment = _initialize_openai_v1(
^^^^^^^^^^^^^^^^^^^^^^
File "D:\python\DB-GPT\packages\dbgpt-core\src\dbgpt\model\utils\chatgpt_utils.py", line 90, in _initialize_openai_v1
raise ValueError("api_key is required, please set OPENAI_API_KEY environment")
ValueError: api_key is required, please set OPENAI_API_KEY environment

2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Register trigger HttpTrigger(node_id=b6370db5-0495-4bbb-b4d1-944fd793499e)
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO mount router function <function HttpTrigger._create_route_func.<locals>.create_route_function.<locals>.route_function at 0x000001F657933CE0>(AWEL_trigger_route__examples_data_analyst_copilot), endpoint: /examples/data_analyst/copilot, methods: ['POST']
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO Mount http trigger success, path: /api/v1/awel/trigger/examples/data_analyst/copilot
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Include router <fastapi.routing.APIRouter object at 0x000001F6519DA5D0> to prefix path /api/v1/awel/trigger
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Register trigger HttpTrigger(node_id=84c2db2b-2bc9-4889-ad78-ba1eb5d9b7d9)
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO mount router function <function HttpTrigger._create_route_func.<locals>.create_route_function.<locals>.route_function at 0x000001F657964A40>(AWEL_trigger_route__examples_simple_chat), endpoint: /examples/simple_chat, methods: ['POST']
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO Mount http trigger success, path: /api/v1/awel/trigger/examples/simple_chat
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Include router <fastapi.routing.APIRouter object at 0x000001F6519DA5D0> to prefix path /api/v1/awel/trigger
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Register trigger HttpTrigger(node_id=adde88a6-c173-48ed-8696-017c76d93f48)
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO mount router function <function HttpTrigger._create_route_func.<locals>.create_route_function.<locals>.route_function at 0x000001F657965300>(AWEL_trigger_route__examples_simple_history_multi_round_chat_completions), endpoint: /examples/simple_history/multi_round/chat/completions, methods: ['POST']
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO Mount http trigger success, path: /api/v1/awel/trigger/examples/simple_history/multi_round/chat/completions
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Include router <fastapi.routing.APIRouter object at 0x000001F6519DA5D0> to prefix path /api/v1/awel/trigger
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Register trigger HttpTrigger(node_id=ba9ae640-ddeb-4297-8d13-d78772576181)
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO mount router function <function HttpTrigger._create_route_func.<locals>.create_route_function.<locals>.route_function_get at 0x000001F657965A80>(AWEL_trigger_route__examples_hello), endpoint: /examples/hello, methods: ['GET']
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO Mount http trigger success, path: /api/v1/awel/trigger/examples/hello
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Include router <fastapi.routing.APIRouter object at 0x000001F6519DA5D0> to prefix path /api/v1/awel/trigger
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Register trigger HttpTrigger(node_id=aa85b430-a8f7-4543-add8-f30732fcaea5)
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO mount router function <function HttpTrigger._create_route_func.<locals>.create_route_function.<locals>.route_function at 0x000001F657966160>(AWEL_trigger_route__examples_simple_client_chat_completions), endpoint: /examples/simple_client/chat/completions, methods: ['POST']
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO Mount http trigger success, path: /api/v1/awel/trigger/examples/simple_client/chat/completions
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Include router <fastapi.routing.APIRouter object at 0x000001F6519DA5D0> to prefix path /api/v1/awel/trigger
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Register trigger HttpTrigger(node_id=45418f50-9fd4-469e-8e28-26263173aebb)
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO mount router function <function HttpTrigger._create_route_func.<locals>.create_route_function.<locals>.route_function at 0x000001F658990720>(AWEL_trigger_route__examples_simple_client_count_token), endpoint: /examples/simple_client/count_token, methods: ['POST']
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.http_trigger[25888] INFO Mount http trigger success, path: /api/v1/awel/trigger/examples/simple_client/count_token
2025-03-07 18:07:01 TSJ-014968 dbgpt.core.awel.trigger.trigger_manager[25888] INFO Include router <fastapi.routing.APIRouter object at 0x000001F6519DA5D0> to prefix path /api/v1/awel/trigger
Libro Server start!
start libro exception![WinError 2] The system cannot find the specified file.
2025-03-07 18:07:01 TSJ-014968 dbgpt.model.cluster.worker.manager[25888] INFO Begin start all worker, apply_req: None
2025-03-07 18:07:01 TSJ-014968 dbgpt.model.cluster.worker.manager[25888] INFO Apply req: None, apply_func: <function LocalWorkerManager._start_all_worker.<locals>._start_worker at 0x000001F658993100>
2025-03-07 18:07:01 TSJ-014968 dbgpt.model.cluster.worker.manager[25888] INFO Apply to all workers
INFO: Application startup complete.
2025-03-07 18:07:01 TSJ-014968 dbgpt.model.cluster.worker.default_worker[25888] INFO Begin load model, model params:

=========================== SiliconFlowDeployModelParameters ===========================

name: Qwen/Qwen2.5-Coder-32B-Instruct
provider: proxy/siliconflow
verbose: False
concurrency: 100
backend: None
prompt_template: None
context_length: None
reasoning_model: None
api_base: https://api.siliconflow.cn/v1
api_key: sk******wx
api_type: None
api_version: None
http_proxy: None

======================================================================

INFO: Uvicorn running on http://0.0.0.0:5670 (Press CTRL+C to quit)
2025-03-07 18:07:01 TSJ-014968 dbgpt.model.adapter.proxy_adapter[25888] INFO Load model from params:

=========================== SiliconFlowDeployModelParameters ===========================

name: Qwen/Qwen2.5-Coder-32B-Instruct
provider: proxy/siliconflow
verbose: False
concurrency: 100
backend: None
prompt_template: None
context_length: None
reasoning_model: None
api_base: https://api.siliconflow.cn/v1
api_key: sk******wx
api_type: None
api_version: None
http_proxy: None

======================================================================

llm client class: <class 'dbgpt.model.proxy.llms.siliconflow.SiliconFlowLLMClient'>
2025-03-07 18:07:04 TSJ-014968 dbgpt.util.api_utils[25888] WARNING No healthy urls found, selecting randomly
INFO: 127.0.0.1:55282 - "POST /api/controller/models HTTP/1.1" 200 OK
2025-03-07 18:07:05 TSJ-014968 dbgpt.model.cluster.worker.default_worker[25888] INFO Begin load model, model params:

=========================== SiliconFlowDeployModelParameters ===========================

name: deepseek-ai/DeepSeek-R1
provider: proxy/siliconflow
verbose: False
concurrency: 100
backend: None
prompt_template: None
context_length: None
reasoning_model: None
api_base: https://api.siliconflow.cn/v1
api_key: sk******wx
api_type: None
api_version: None
http_proxy: None

======================================================================

2025-03-07 18:07:05 TSJ-014968 dbgpt.model.adapter.proxy_adapter[25888] INFO Load model from params:

=========================== SiliconFlowDeployModelParameters ===========================

name: deepseek-ai/DeepSeek-R1
provider: proxy/siliconflow
verbose: False
concurrency: 100
backend: None
prompt_template: None
context_length: None
reasoning_model: None
api_base: https://api.siliconflow.cn/v1
api_key: sk******wx
api_type: None
api_version: None
http_proxy: None

======================================================================

llm client class: <class 'dbgpt.model.proxy.llms.siliconflow.SiliconFlowLLMClient'>
INFO: 127.0.0.1:55284 - "POST /api/controller/models HTTP/1.1" 200 OK
2025-03-07 18:07:07 TSJ-014968 dbgpt.model.cluster.worker.default_worker[25888] INFO Begin load model, model params:

=========================== SiliconFlowDeployModelParameters ===========================

name: deepseek-ai/DeepSeek-V3
provider: proxy/siliconflow
verbose: False
concurrency: 100
backend: None
prompt_template: None
context_length: None
reasoning_model: None
api_base: https://api.siliconflow.cn/v1
api_key: sk******wx
api_type: None
api_version: None
http_proxy: None

======================================================================

2025-03-07 18:07:07 TSJ-014968 dbgpt.model.adapter.proxy_adapter[25888] INFO Load model from params:

=========================== SiliconFlowDeployModelParameters ===========================

name: deepseek-ai/DeepSeek-V3
provider: proxy/siliconflow
verbose: False
concurrency: 100
backend: None
prompt_template: None
context_length: None
reasoning_model: None
api_base: https://api.siliconflow.cn/v1
api_key: sk******wx
api_type: None
api_version: None
http_proxy: None

======================================================================

llm client class: <class 'dbgpt.model.proxy.llms.siliconflow.SiliconFlowLLMClient'>
INFO: 127.0.0.1:55294 - "POST /api/controller/models HTTP/1.1" 200 OK
2025-03-07 18:07:09 TSJ-014968 dbgpt.model.cluster.worker.embedding_worker[25888] INFO Load embeddings model: BAAI/bge-large-zh-v1.5
2025-03-07 18:07:09 TSJ-014968 dbgpt.util.code.server[25888] INFO Code server is ready
INFO: 127.0.0.1:55297 - "POST /api/controller/models HTTP/1.1" 200 OK
2025-03-07 18:07:10 TSJ-014968 dbgpt.model.cluster.worker.embedding_worker[25888] INFO Load rerank embeddings model: BAAI/bge-reranker-v2-m3
INFO: 127.0.0.1:55303 - "POST /api/controller/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:55305 - "POST /api/controller/models HTTP/1.1" 200 OK
2025-03-07 18:07:11 TSJ-014968 dbgpt.model.cluster.worker.manager[25888] INFO There has model storage, start the model from storage
begin run _add_app_startup_event
2025-03-07 18:07:12 TSJ-014968 dbgpt_serve.datasource.manages.connect_config_db[25888] INFO Result: <sqlalchemy.engine.cursor.CursorResult object at 0x000001F658B4B770>
2025-03-07 18:07:14 TSJ-014968 dbgpt_serve.rag.connector[25888] INFO VectorStore:<class 'dbgpt_ext.storage.vector_store.chroma_store.ChromaStore'>
2025-03-07 18:07:15 TSJ-014968 dbgpt_serve.rag.connector[25888] INFO VectorStore:<class 'dbgpt_ext.storage.vector_store.chroma_store.ChromaStore'>
2025-03-07 18:07:15 TSJ-014968 dbgpt_ext.storage.vector_store.chroma_store[25888] INFO Check persist_dir: D:\python\DB-GPT\pilot/data\yintech_datagear_profile.vectordb
2025-03-07 18:07:16 TSJ-014968 dbgpt.storage.base[25888] INFO Loading 3220 chunks in 322 groups with 1 threads.
2025-03-07 18:07:16 TSJ-014968 dbgpt_ext.storage.vector_store.chroma_store[25888] INFO ChromaStore load document
2025-03-07 18:07:16 TSJ-014968 dbgpt.model.cluster.worker.embedding_worker[25888] INFO Receive embeddings request, model: BAAI/bge-large-zh-v1.5

---end!

Win10, conda, uv, 7.0.
Installation succeeded at 9 AM yesterday and the system ran normally, but the database connection could not read table indexes and comments, so I reinstalled the latest version today. Now, apart from the example error message above, the only error I see is "start libro exception![WinError 2] The system cannot find the specified file", and the service fails to start.


vnicers commented Mar 9, 2025

It looks like the service started successfully. The error "ValueError: api_key is required, please set OPENAI_API_KEY environment" comes from one of the bundled example DAGs: it only means that environment variable was not found, and it does not affect startup.
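If you do want that example (examples/awel/simple_rag_summary_example.py) to load, the only requirement the traceback names is the OPENAI_API_KEY environment variable. Below is a minimal, illustrative sketch of that pre-flight check; the placeholder key and the check itself are not part of DB-GPT:

```python
import os

# Illustrative check only: the example DAG builds OpenAILLMClient(), and the
# traceback shows _initialize_openai_v1 raising ValueError when OPENAI_API_KEY
# is missing. If the variable is absent, the AWEL loader logs the error and
# skips that example; the rest of the server keeps starting.
if not os.getenv("OPENAI_API_KEY"):
    # Set it before launching the server, e.g. `set OPENAI_API_KEY=sk-...` in cmd
    # or `$env:OPENAI_API_KEY = "sk-..."` in PowerShell. The value below is a placeholder.
    os.environ["OPENAI_API_KEY"] = "sk-your-key-here"
```

The SiliconFlow models in your configuration load independently of this, as the rest of your log shows.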
