
Commit

Fix the issue of no chat history record when using Google Gemini Pro (#1743)

* Fix the issue where no chat history is recorded when using Google Gemini Pro

Just add a log_chat call in bridge_google_gemini.py

* Update bridge_google_gemini.py

---------

Co-authored-by: binary-husky <[email protected]>
ycwfs and binary-husky committed Apr 25, 2024
1 parent cadaa81 commit 81df0aa
Showing 1 changed file with 2 additions and 1 deletion.
request_llms/bridge_google_gemini.py (2 additions, 1 deletion)

@@ -8,7 +8,7 @@
 import time
 from request_llms.com_google import GoogleChatInit
 from toolbox import ChatBotWithCookies
-from toolbox import get_conf, update_ui, update_ui_lastest_msg, have_any_recent_upload_image_files, trimmed_format_exc
+from toolbox import get_conf, update_ui, update_ui_lastest_msg, have_any_recent_upload_image_files, trimmed_format_exc, log_chat

 proxies, TIMEOUT_SECONDS, MAX_RETRY = get_conf('proxies', 'TIMEOUT_SECONDS', 'MAX_RETRY')
 timeout_bot_msg = '[Local Message] Request timeout. Network error. Please check proxy settings in config.py.' + \
@@ -99,6 +99,7 @@ def make_media_input(inputs, image_paths):
     gpt_replying_buffer += paraphrase['text']  # processed with the json parsing library
     chatbot[-1] = (inputs, gpt_replying_buffer)
     history[-1] = gpt_replying_buffer
+    log_chat(llm_model=llm_kwargs["llm_model"], input_str=inputs, output_str=gpt_replying_buffer)
     yield from update_ui(chatbot=chatbot, history=history)
 if error_match:
     history = history[-2]  # erroneous replies are not kept in the conversation history
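For context on the change: toolbox.log_chat is called with the model name, the user input, and the accumulated reply buffer, so each Gemini exchange gets written to the project's chat log just like the other model bridges. The sketch below is only an illustration of a logger with that call signature; the gpt_log directory, per-day file naming, and plain-text format are assumptions, not the repository's actual implementation.

# Hypothetical sketch of a chat logger with the same call signature as toolbox.log_chat.
# The log directory, file naming, and entry format below are assumptions.
import os
import time


def log_chat(llm_model: str, input_str: str, output_str: str) -> None:
    """Append one request/response pair to a per-day chat log file."""
    log_dir = "gpt_log"  # assumed location; the real project may use a configurable path
    os.makedirs(log_dir, exist_ok=True)
    log_file = os.path.join(log_dir, time.strftime("chat_%Y-%m-%d.log"))
    with open(log_file, "a", encoding="utf-8") as f:
        f.write(f"[{time.strftime('%H:%M:%S')}] model={llm_model}\n")
        f.write(f">>> user:\n{input_str}\n")
        f.write(f"<<< assistant:\n{output_str}\n\n")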
