
[BUG] An error occurred: <ContextVar name='context_contexts' at 0x17b9ea020> after upgrade to v1.0 #1578

Open
lorenzobalzani opened this issue Oct 21, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@lorenzobalzani

lorenzobalzani commented Oct 21, 2024

Bug Description
An error occurred: <ContextVar name='context_contexts' at 0x14cd1fce0>

To Reproduce

    from trulens.apps.langchain import TruChain
    from langchain_community.vectorstores import Qdrant
    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.runnables import RunnablePassthrough
    from langchain_openai import AzureChatOpenAI

    def invoke_conversational_llm(self, user_input, prompt_str, chat_history=None, rag_top_k=4, temperature=0,
                                  deployment_name=None, rag_score_threshold=0.5):
        """
        Invoke the conversational language model.
        """

        if not chat_history:
            chat_history = []

        try:
            retriever = Qdrant(...)
            llm = AzureChatOpenAI(...)

            def format_docs(docs):
                return "\n\n".join(doc.page_content for doc in docs)

            # This Runnable takes a dict with keys 'input' and 'context',
            # formats them into a prompt, and generates a response.
            rag_chain = (
                {
                    "input": lambda x: x["input"],
                    "context": lambda x: format_docs(x["context"]),  # context
                    "chat_history": lambda x: x["chat_history"],  # chat history
                }
                | prompt_str  # format query and context into prompt template
                | llm  # generate response
                | StrOutputParser()  # coerce to string
            ).with_config(run_name="rag_chain")

            input_args = {"input": user_input,
                          "chat_history": [(message.type, message.content) for interaction in chat_history
                                           for message in interaction]}

            retrieve_docs_chain = (lambda _: input_args["input"]) | retriever

            # Below, we chain `.assign` calls. This takes a dict and successively adds
            # the keys "context" and "answer", where the value for each key is computed
            # by a Runnable that operates on all keys already in the dict.
            docs_chain = RunnablePassthrough.assign(context=retrieve_docs_chain)
            rag_chain = docs_chain | RunnablePassthrough.assign(answer=rag_chain)

            tru_chain = TruChain(rag_chain, app_name=self._collection_name, feedbacks=[])

            with tru_chain as recorder:
                ai_answer = rag_chain.invoke(input=input_args)  # --> error happens here

The error happens in tru_wrapper, instruments.py:633.
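As context for readers unfamiliar with the pattern used in the reproduction, `RunnablePassthrough.assign` merges newly computed keys into the incoming dict. A dependency-free sketch of that idea (my own illustration using a hypothetical `assign` helper, not langchain or trulens code):

```python
def assign(**fns):
    # Toy stand-in for RunnablePassthrough.assign: build a step that merges
    # keys computed from the incoming dict into a copy of that dict.
    def step(d):
        return {**d, **{k: f(d) for k, f in fns.items()}}
    return step

# Mirror the report's two-stage chain: first add "context", then "answer".
docs_step = assign(context=lambda d: ["doc A", "doc B"])  # pretend retrieval
answer_step = assign(answer=lambda d: "answered: " + d["input"])

out = answer_step(docs_step({"input": "hello"}))
print(out)
# {'input': 'hello', 'context': ['doc A', 'doc B'], 'answer': 'answered: hello'}
```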

Expected behavior
No error after migration.

Relevant Logs/Tracebacks
No errors besides the main one.

Environment:

  • OS: macOS
  • Python Version: 3.12
  • TruLens version: 1.1.0
  • Versions of other relevant installed libraries: langchain==0.2.5

Additional context
Before migrating to the new version, no error occurred.

@sahil-sharma-50

I'm experiencing the same issue. Is there any update on this?

@sfc-gh-jreini
Contributor

sfc-gh-jreini commented Oct 24, 2024

Hey folks - @piotrm0 is currently working on a fix for this. As far as we can tell, this issue is related to running in particular environments, such as Google Colab or Snowflake Notebooks, that have incomplete support for Python context variables; it does not occur locally. @sahil-sharma-50 @lorenzobalzani Are you using either Google Colab or Snowflake Notebooks when you run into this issue? Can you move to a local environment while we wait on a fix?
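To illustrate the failure mode described above (a minimal sketch of standard `contextvars` behavior, assuming the environment runs user code on worker threads the way Streamlit does; this is not trulens code):

```python
import threading
from contextvars import ContextVar

var = ContextVar("context_contexts")  # no default, like the failing variable
var.set("set in main thread")

results = {}

def worker():
    # New threads start with an empty Context, so the value set above is
    # invisible here and get() raises LookupError.
    try:
        results["no_default"] = var.get()
    except LookupError:
        results["no_default"] = "LookupError"

safe = ContextVar("context_contexts", default=set())  # the workaround's shape

def worker_safe():
    results["with_default"] = safe.get()  # falls back to the default, no error

for target in (worker, worker_safe):
    t = threading.Thread(target=target)
    t.start()
    t.join()

print(results)  # {'no_default': 'LookupError', 'with_default': set()}
```

This matches the shape of the workaround below: giving each ContextVar a default makes lookups safe even in a fresh context.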

@lorenzobalzani
Author

> Hey folks - @piotrm0 is currently working on a fix for this. As far as we can tell this issue is related to running in particular environments such as Google Colab or Snowflake Notebooks that have issues with support for python Context Variables, and does not occur locally. @sahil-sharma-50 @lorenzobalzani Are you using either Google Colab or Snowflake notebooks when you run into this issue? Can you move to a local environment while we wait on a fix?

Hi, I’m currently running the package within a Streamlit application, but I’m not sure if it’s the root cause of the issue. However, we need to continue using Streamlit.

@sfc-gh-pmardziel
Contributor

sfc-gh-pmardziel commented Oct 24, 2024

Hi @lorenzobalzani , we are working on a fix for the next release. For now, can you add this to your code before you import the rest of the trulens library:

from trulens.core.feedback.endpoint import Endpoint
from trulens.core.instruments import WithInstrumentCallbacks
from contextvars import ContextVar
Endpoint._context_endpoints = ContextVar("endpoints", default={})
WithInstrumentCallbacks._context_contexts = ContextVar("context_contexts", default=set())
WithInstrumentCallbacks._stack_contexts = ContextVar("stack_contexts", default={})

@sfc-gh-pmardziel
Contributor

I had to update the code listing above.
