
PanelCallbackHandler feature enhancements #5679

Open
8 of 13 tasks
MarcSkovMadsen opened this issue Oct 18, 2023 · 23 comments
Labels
type: bug Something isn't correct or isn't working

Comments

@MarcSkovMadsen
Collaborator

MarcSkovMadsen commented Oct 18, 2023

I'm running the latest main branch of Panel testing out the PanelCallbackHandler.

Basic Example not working

I would expect it to show me a chain of thought similar to what Streamlit does. But it does not show me anything that I could not see just using the ChatInterface.

image

panel-callback-handler.mp4
import panel as pn
from langchain.llms import OpenAI

pn.extension()


def callback(contents, user, instance):
    llm.predict(contents)


instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, callbacks=[callback_handler])

instance.servable()

Cannot configure PanelCallbackHandler

There are no arguments for configuring the PanelCallbackHandler.

image

Compare this to the StreamlitCallbackHandler https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py.

image

Besides the above, Streamlit also outputs the chain of thought to stdout, which is really helpful because it lets me create a log of what happens that I can analyze later. I would like that as an option too.
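A stdout-logging option could be sketched roughly like this, without any framework at all. The class name and event methods below are illustrative assumptions modeled on LangChain-style callbacks, not Panel's actual API:

```python
import sys

class StdoutLoggingHandler:
    """Sketch of a handler that mirrors callback events to a stream
    (stdout by default) so the chain of thought can be captured as a log."""

    def __init__(self, stream=None):
        # Accept any file-like object so the log can go to a file instead
        self.stream = stream if stream is not None else sys.stdout

    def on_llm_start(self, prompt):
        self.stream.write(f"[llm start] {prompt}\n")

    def on_llm_end(self, response):
        self.stream.write(f"[llm end] {response}\n")
```

Passing an open file handle as `stream` would produce the analyzable log described above.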

Lots of methods not implemented.

Looking at the PanelCallbackHandler I see that many of the methods are not implemented; they just call the super().xyz method. To me it's a signal that only selected functionality is implemented?

image

Compare this to https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py

Inherits object when langchain not installed.

Pyright complains that the PanelCallbackHandler inherits from object when langchain is not installed.

Maybe we can live with this, but it is theoretically a problem that we call lots of super().xyz methods that do not exist on object.

image
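The hazard can be reproduced without langchain at all. In this minimal sketch (all names are illustrative, not Panel's actual code), the base class falls back to object exactly as described, and the super() call then fails at runtime:

```python
class _BaseCallbackHandler:
    """Stand-in for langchain's BaseCallbackHandler."""
    def on_llm_start(self, *args, **kwargs):
        return "handled by base"

LANGCHAIN_INSTALLED = False  # simulate langchain being absent

# The conditional-base pattern: fall back to `object` to avoid an ImportError
Base = _BaseCallbackHandler if LANGCHAIN_INSTALLED else object

class Handler(Base):
    def on_llm_start(self, *args, **kwargs):
        # With Base == object this super() call raises AttributeError,
        # because object has no on_llm_start method.
        return super().on_llm_start(*args, **kwargs)

try:
    Handler().on_llm_start()
except AttributeError as err:
    print("super() call failed:", err)
```

The class definition itself succeeds either way; the failure only surfaces when an unimplemented event fires, which is why static checkers like Pyright flag it first.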

PanelCallbackHandler leads to bad-practice code.

The below code is the first example in the reference guide and illustrates some issues:

  • When we define the callback, the llm is not yet defined. This is bad practice and could lead to issues.
  • We have to define both the ChatInterface instance and the PanelCallbackHandler. This is quirky and hard to remember.
import panel as pn
from langchain.llms import OpenAI

pn.extension()


def callback(contents, user, instance):
    llm.predict(contents)


instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, callbacks=[callback_handler])

instance.servable()
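The first bullet, referencing llm before it exists, comes down to Python's late binding of global names. This plain-Python demonstration uses a hypothetical FakeLLM as a stand-in for the OpenAI object:

```python
def callback(contents, user=None, instance=None):
    # `llm` is looked up at call time (late binding), not when the
    # function is defined, so this only works if `llm` exists by then.
    return llm.predict(contents)

# Calling the callback before `llm` is assigned fails at runtime:
try:
    callback("hello")
except NameError as err:
    print("call before llm is defined fails:", err)

class FakeLLM:
    """Hypothetical stand-in for OpenAI(...)."""
    def predict(self, text):
        return text.upper()

llm = FakeLLM()
print(callback("hello"))  # prints "HELLO" once llm is defined
```

The example only works because the ChatInterface does not invoke the callback until a user sends a message, by which point `llm` happens to exist; nothing enforces that ordering.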

Compare this to the Streamlit code

from langchain.callbacks import StreamlitCallbackHandler
from langchain.llms import OpenAI
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = llm.predict(prompt, callbacks=[st_callback])
        st.write(response)

or the reference example

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)

I like this Panel version better.

import panel as pn
from langchain.llms import OpenAI

pn.extension()

llm = OpenAI(temperature=0)

def callback(contents, user, instance):
    callback_handler = pn.chat.PanelCallbackHandler(instance)
    llm.predict(contents, callbacks=[callback_handler])


instance = pn.chat.ChatInterface(callback=callback)
instance.servable()

It also makes it easier for users to declare the llm once, for example in a separate module or by using caching.
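Declaring the llm once and reusing it across callback runs can be sketched with functools.lru_cache (a generic stand-in; in Panel, pn.cache could presumably play the same role, and FakeLLM is hypothetical):

```python
from functools import lru_cache

class FakeLLM:
    """Hypothetical stand-in for an expensive OpenAI(...) construction."""
    def predict(self, text):
        return text.upper()

@lru_cache(maxsize=1)
def get_llm():
    # The constructor body runs only once, however many callbacks fire;
    # every later call returns the same cached instance.
    return FakeLLM()

assert get_llm() is get_llm()  # the same instance is reused
```

Each callback invocation would then call get_llm() instead of relying on a module-level global being defined in time.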

Not clear that PanelCallbackHandler is for LangChain

As long as the callback handler is part of the Panel code base, I think it's better to make the LangChain dependency explicit by

  • either importing from a langchain submodule, as in from panel.chat.langchain import PanelCallbackHandler, or
  • renaming to LangChainCallbackHandler, i.e. pn.chat.LangChainCallbackHandler.

Reference Notebook does not explain or show what to expect from the code.

The reference notebook only contains code. It does not contain any text or videos that can help the user understand how these examples should work, leading to confusion, also for me, as I had other expectations.

Output from CallbackHandler not clear

What does the below mean? Is it really correctly formatted?

image

@MarcSkovMadsen MarcSkovMadsen added the type: bug Something isn't correct or isn't working label Oct 18, 2023
@MarcSkovMadsen
Collaborator Author

Duck Duck Go Agent not showing chain of thought

Maybe the basic example above works as expected because it is only expected to add the ChatMessage. But this agent does not show a chain of thought either.

image

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, streaming=True, callbacks=[callback_handler])
tools = load_tools(["ddg-search"], callbacks=[callback_handler])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True, callbacks=[callback_handler]
)

instance.servable()

Streamlit version

image

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)

@MarcSkovMadsen
Collaborator Author

Maths Assistants not showing chain of thought either

The Math Assistant example in panel-chat-examples uses the PanelCallbackHandler.

Maybe it shows me something like a chain of thought. But it's really, really hard for me to know, as the user experience is nowhere near that of the StreamlitCallbackHandler.

image

@MarcSkovMadsen MarcSkovMadsen changed the title PanelCallbackHandler not working as expected PanelCallbackHandler not working Oct 18, 2023
@MarcSkovMadsen MarcSkovMadsen added this to the v1.3.0 milestone Oct 18, 2023
@ahuang11
Contributor

Examples not working because holoviz-topics/panel-chat-examples#67 isn't merged yet.

@ahuang11
Contributor

Lots of methods not implemented

Need clarification. It has all the methods listed on BaseCallbackHandler
https://python.langchain.com/docs/modules/callbacks/

@MarcSkovMadsen
Collaborator Author

MarcSkovMadsen commented Oct 18, 2023

Examples not working because holoviz-topics/panel-chat-examples#67 isn't merged yet.

I don't understand 👍. I'm primarily reporting issues with the main branch of Panel?

@ahuang11
Contributor

Inherits object when langchain not installed.

This is done to prevent import errors. It will eventually be migrated to LangChain, I think.

@MarcSkovMadsen
Collaborator Author

Lots of methods not implemented

Need clarification. It has all the methods listed on BaseCallbackHandler https://python.langchain.com/docs/modules/callbacks/

Thanks. I'm referring to the picture posted above. Maybe it's because right now I don't see any chain of thought, so it's hard for me to imagine how this should work.

But in that picture, lots of the methods don't do anything to the Panel ChatInterface. They just call super().

@MarcSkovMadsen
Collaborator Author

Could you share a video and code of the PanelCallbackHandler working @ahuang11? That would help me a lot to understand how you would expect this to work. Thanks.

@ahuang11
Contributor

There is no chain of thought because there are no agents/tools involved in this. This is simply a wrapper of OpenAI generation.

import panel as pn
from langchain.llms import OpenAI

pn.extension()


def callback(contents, user, instance):
    llm.predict(contents)


instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, callbacks=[callback_handler])

instance.servable()

@MarcSkovMadsen
Collaborator Author

There is no chain of thought because there are no agents/tools involved in this. This is simply a wrapper of OpenAI generation.


Thanks. What about the "duck duck go" and "maths assistant" examples I'm referring to?

@ahuang11
Contributor

ahuang11 commented Oct 18, 2023

I assume there's no chain of thought there because it did not need to use the tool.

Try
https://github.com/holoviz/panel/blob/main/examples/reference/chat/PanelCallbackHandler.ipynb

import panel as pn
from langchain.agents import AgentType, load_tools, initialize_agent
from langchain.llms import OpenAI

pn.extension()


async def callback(contents, *args):
    await agent.arun(contents)


instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, callbacks=[callback_handler], streaming=True)
tools = load_tools(["serpapi", "llm-math"], llm=llm, callbacks=[callback_handler])
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    callbacks=[callback_handler],
)

instance.servable()
image

@MarcSkovMadsen MarcSkovMadsen mentioned this issue Oct 18, 2023
@ahuang11
Contributor

I believe this issue is mostly feature enhancements, and I appreciate the issue report. I don't think this should block the release of 1.3.0.

Please feel free to create PRs to improve the Langchain integration.

@ahuang11 ahuang11 removed this from the v1.3.0 milestone Oct 18, 2023
@ahuang11 ahuang11 changed the title PanelCallbackHandler not working PanelCallbackHandler feature enhancements Oct 18, 2023
@MarcSkovMadsen
Collaborator Author

What is the serpapi for? I don't have a key. Is it necessary?

@ahuang11
Contributor

I believe it's a search tool. You could probably use DuckDuckGo instead.

@MarcSkovMadsen
Collaborator Author

How would you explain that the "duck duck go" example above #5679 (comment) shows a chain of thought for Streamlit but not for Panel?

@ahuang11
Contributor

Unfortunately, I don't have enough time to investigate, but would like to understand it better too!

@MarcSkovMadsen
Collaborator Author

MarcSkovMadsen commented Oct 18, 2023

I can also get chain of thought with the Maths Assistant.

image

But the example at panel-chat-examples has issues holoviz-topics/panel-chat-examples#68.

@MarcSkovMadsen
Collaborator Author

MarcSkovMadsen commented Oct 18, 2023

I believe this issue is mostly feature enhancements, and I appreciate the issue report. I don't think this should block the release of 1.3.0.

Please feel free to create PRs to improve the Langchain integration.

For me the main issues are

  • Confidence in the PanelCallbackHandler.
    • Some working examples will help
    • Some videos of the examples working in the Reference Notebook will help
  • Reference notebook does not contain any visualizations, making it hard for users (including me) to understand what to expect from this.
  • Naming. I would recommend pn.chat.langchain.PanelCallbackHandler or alternatively pn.chat.LangChainCallbackHandler to make it clear the PanelCallbackHandler is for LangChain. This is very hard to change later.

@ahuang11
Contributor

Some working examples will help

I believe all the examples within the reference gallery + panel-chat-examples are working

Some videos of the examples working in the Reference Notebook will help

Perhaps a link to the panel-chat-examples langchain directory

This is very hard to change later.

I imagine it'll be in Langchain so it can simply be PanelCallbackHandler, like Streamlit's is StreamlitCallbackHandler.
https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py#L225

@ahuang11
Contributor

To address the missing visual aids, I added links in #5681

@MarcSkovMadsen
Collaborator Author

Sounds good. I will give the reference notebook an iteration.

@MarcSkovMadsen
Collaborator Author

I got the duck duck go example working to some extent by fixing the code

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True, 
)

async def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    await agent.arun(contents, callbacks=[callback_handler])

pn.chat.ChatInterface(callback=callback).servable()

panel-callback-handler-agent

But it does not show me the full search results. Streamlit does show those, right?

image

@PrashantSaikia

PrashantSaikia commented Mar 2, 2024

Hi. When making a retrieval-augmented generation app in Panel, where the LLM queries a vector database created from some documents, using PanelCallbackHandler displays the whole chain-of-thought process, which includes:

  1. Showing the relevant documents fetched, by "LangChain (retriever)"
  2. Streaming the response from the LLM, by "LangChain (gpt-4-1106-preview)"
  3. And once the streaming is finished, the same response is copied and displayed again, by "Assistant".

Is there a way to not display 1 and 3? I just want to be able to stream the response from the LLM, and name the chatbot "Bot".

Here is my code, if it is relevant.
