PanelCallbackHandler feature enhancements #5679
**Duck Duck Go agent not showing chain of thought**

Maybe the basic example above works as expected because it is only expected to add the `ChatMessage`. But this agent does not show chain of thought either:

```python
from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, streaming=True, callbacks=[callback_handler])
tools = load_tools(["ddg-search"], callbacks=[callback_handler])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True, callbacks=[callback_handler]
)
instance.servable()
```

Streamlit version:

```python
from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)
```
**Maths Assistant not showing chain of thought either**

The Math Assistant example in panel-chat-examples maybe shows me something like chain of thought, but it's really, really hard for me to know, as the user experience is nowhere near that of the Streamlit version.
Examples not working because holoviz-topics/panel-chat-examples#67 isn't merged yet.
Need clarification. It has all the methods listed on `BaseCallbackHandler`.
I don't understand 👍. I'm reporting issues with the `PanelCallbackHandler`.
This is done to prevent import errors. It will eventually be migrated to LangChain, I think.
Thanks. I'm referring to the picture posted above. Maybe it's because right now I don't see any chain of thought, so it's hard for me to imagine how this should work. But in that picture lots of the methods don't do anything to the Panel UI.
Could you share a video and code of the basic example?
There is no chain of thought because there are no agents / tools involved in this. This is simply a wrapper of OpenAI generation.

```python
import panel as pn
from langchain.llms import OpenAI

pn.extension()

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, callbacks=[callback_handler])
instance.servable()
```
Thanks. What about the "duck duck go" and "maths assistant" examples I'm referring to?
I assume there's no chain of thought there because it did not need to use the tool. Try:

```python
import panel as pn
from langchain.agents import AgentType, load_tools, initialize_agent
from langchain.llms import OpenAI

pn.extension()

async def callback(contents, *args):
    await agent.arun(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, callbacks=[callback_handler], streaming=True)
tools = load_tools(["serpapi", "llm-math"], llm=llm, callbacks=[callback_handler])
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    callbacks=[callback_handler],
)
instance.servable()
```
I believe this issue is mostly feature enhancements, and I appreciate the issue report. I don't think this should block the release of 1.3.0. Please feel free to create PRs to improve the LangChain integration.
What is the `serpapi` tool?
I believe it's a search tool. You could probably use duckduckgo instead.
How would you explain that the "duck duck go" example above (#5679 (comment)) shows chain of thought for Streamlit but not for Panel?
Unfortunately, I don't have enough time to investigate, but I would like to understand it better too!
I can also get chain of thought with the Maths Assistant. But the example at …
For me the main issues are
|
I believe all the examples within the reference gallery + panel-chat-examples are working
Perhaps a link to the panel-chat-examples langchain directory
I imagine it'll be in LangChain so it can simply be `PanelCallbackHandler`, like Streamlit's is `StreamlitCallbackHandler`.
To address the missing visual aids, I added links in #5681 |
Sounds good. I will give the reference notebook an iteration. |
I got the duck duck go example working to some extent by fixing the code:

```python
from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True,
)

async def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    await agent.arun(contents, callbacks=[callback_handler])

pn.chat.ChatInterface(callback=callback).servable()
```

But it does not show me the full search results, which is what Streamlit does.
Hi. When making a retrieval augmented generation app in Panel, where the LLM queries a vector database created from some documents, using …

Is there a way to not display 1 and 3? I just want to be able to stream the response from the LLM and name the chatbot "Bot". Here is my code, if it is relevant.
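One generic direction for suppressing the unwanted messages (purely a stdlib sketch with made-up names, not Panel's actual API): subclass the callback handler and turn the events you don't want rendered into no-ops, keeping only token streaming. A real attempt would subclass `pn.chat.langchain.PanelCallbackHandler` and would need checking against its actual method names.

```python
# Sketch only: `BaseHandler` stands in for a real callback handler base
# class; the event-method names mimic LangChain-style callbacks but are
# illustrative, not the real API.
class BaseHandler:
    def __init__(self):
        self.displayed = []  # stand-in for "messages rendered in the UI"

    def on_retriever_end(self, documents):
        self.displayed.append(("retriever", documents))

    def on_llm_new_token(self, token):
        self.displayed.append(("token", token))

    def on_chain_end(self, outputs):
        self.displayed.append(("chain", outputs))


class QuietHandler(BaseHandler):
    """Suppress retriever and chain messages; stream only LLM tokens."""

    def on_retriever_end(self, documents):
        pass  # no-op: don't display the retrieved documents

    def on_chain_end(self, outputs):
        pass  # no-op: don't display the chain summary


h = QuietHandler()
h.on_retriever_end(["doc1"])
h.on_llm_new_token("Hello")
h.on_chain_end({"answer": "Hello"})
print(h.displayed)  # → [('token', 'Hello')]
```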
I'm running the latest `main` branch of Panel, testing out the `PanelCallbackHandler`.

- `PanelCallBackhandler` leads to bad practice code. (To be solved by Improve PanelCallBackHandler basic #5682)
- `PanelCallbackHandler` does not show the final response, only the final output from the last tool. (To be solved by Improve PanelCallBackHandler basic #5682)
- Not clear that `PanelCallbackHandler` is for langchain. (To be solved by Improve PanelCallBackHandler basic #5682)
- `PanelCallBackHandler` inherits `object` when LangChain not installed.
- Output from `PanelCallbackHandler` not clear.
- No `AsyncPanelCallbackHandler`.

**Basic Example not working**

I would expect it to give me chain of thought similar to how Streamlit does it. But it does not show me anything that I could not see just using the `ChatInterface`.

panel-callback-handler.mp4
**Cannot configure `PanelCallbackHandler`**

I have no args for configuring the `PanelCallbackHandler`. Compare this to the `StreamlitCallbackHandler`: https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py.

Besides the above, Streamlit also outputs the chain of thought to stdout, which is really helpful because it can help me create a log of what happens that I can analyze later. I would like that as an option too.
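The stdout logging asked for above could, in principle, be layered on without any new config options, since LangChain fans events out to every handler in its `callbacks=[...]` list. A stdlib-only sketch of that idea (all class names are hypothetical; real LangChain handlers receive many more events):

```python
import logging
import sys

class StdoutLogHandler:
    """Writes each chain-of-thought event to stdout via `logging`."""

    def __init__(self):
        self._log = logging.getLogger("chain_of_thought")
        self._log.setLevel(logging.INFO)
        if not self._log.handlers:  # avoid duplicate handlers on re-run
            self._log.addHandler(logging.StreamHandler(sys.stdout))

    def on_llm_new_token(self, token):
        self._log.info("token: %r", token)


class UiHandler:
    """Stand-in for a UI-updating handler such as PanelCallbackHandler."""

    def __init__(self):
        self.rendered = []

    def on_llm_new_token(self, token):
        self.rendered.append(token)


def emit(handlers, token):
    # Fan each event out to every registered handler, as LangChain does
    # when given callbacks=[ui_handler, log_handler].
    for handler in handlers:
        handler.on_llm_new_token(token)


ui = UiHandler()
handlers = [ui, StdoutLogHandler()]
for tok in ["Thought:", " I should", " search."]:
    emit(handlers, tok)
print(ui.rendered)  # → ['Thought:', ' I should', ' search.']
```

The UI handler keeps rendering as before, while the second handler produces a plain-text log that can be analyzed later.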
**Lots of methods not implemented**

Looking at the `PanelCallBackHandler`, I see that many of the methods are not implemented and instead just call the `super().xyz` method. To me it's a signal that only selected functionality is implemented? Compare this to https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py.
**Inherits `object` when `langchain` not installed**

Pyright complains that the `PanelCallbackHandler` inherits `object` when `langchain` is not installed. Maybe we can live with this, but it is theoretically a problem that we run lots of `super().xyz` methods that do not exist on `object`
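A tiny, self-contained illustration of the theoretical problem described above (all names are made up; `LANGCHAIN_INSTALLED = False` simulates the missing dependency): when the fallback base class is plain `object`, any delegating `super().xyz()` call raises at runtime.

```python
# Illustrative sketch of the fallback-base-class pattern; Panel's real
# code differs in detail.
LANGCHAIN_INSTALLED = False  # pretend langchain is not importable

if LANGCHAIN_INSTALLED:
    from langchain.callbacks.base import BaseCallbackHandler
else:
    # The fallback Pyright complains about: plain `object`.
    BaseCallbackHandler = object


class DemoCallbackHandler(BaseCallbackHandler):
    def on_llm_start(self, serialized, prompts, **kwargs):
        # Delegating upward only works when the real base class exists;
        # `object` has no `on_llm_start`, so this raises AttributeError.
        return super().on_llm_start(serialized, prompts, **kwargs)


handler = DemoCallbackHandler()
try:
    handler.on_llm_start({}, ["hi"])
    outcome = "ok"
except AttributeError:
    outcome = "AttributeError"
print(outcome)  # → AttributeError
```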
.

**`PanelCallBackhandler` leads to bad practice code**

The below code is the first example in the reference guide and illustrates some issues:

- In the `callback`, the `llm` is not defined. This is bad practice and could lead to issues.
- The wiring between the `ChatInterface` instance and the `PanelCallbackHandler`. This is quirky and hard to remember.

Compare this to the Streamlit code or the reference example. I like this Panel version better. It also makes it easier for users to declare the `llm` once, for example in a separate module, or by using caching.

**Not clear that `PanelCallbackHandler` is for langchain**

As long as the callback handler is a part of the Panel code base, I think it's better to make LangChain more explicit, either by keeping `langchain` in the path, as in `from panel.chat.langchain import PanelCallbackHandler`, or by naming it `LangChainCallbackHandler`, i.e. `pn.chat.LangChainCallbackHandler`.

**Reference Notebook does not explain or show what to expect from the code**
The reference notebook only contains code. It does not contain any text or videos that can help the user understand how these examples should work, thus leading to confusion, also for me, as I had other expectations.
**Output from CallbackHandler not clear**

What does the below mean? Is it really correctly formatted?
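As an aside on the "declare the `llm` once, for example in a separate module, or by using caching" suggestion above, here is a minimal caching sketch. `FakeLLM` is a stand-in class instead of a real LLM, so the sketch runs without langchain installed:

```python
from functools import lru_cache

class FakeLLM:
    """Stand-in for e.g. langchain.llms.OpenAI; construction may be costly."""

    def __init__(self, temperature=0.0):
        self.temperature = temperature


@lru_cache(maxsize=1)
def get_llm():
    # Constructed once; every later call returns the same instance, so
    # modules, callbacks, and notebooks can all share one llm.
    return FakeLLM(temperature=0.0)


print(get_llm() is get_llm())  # → True
```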