
[Bug]: Vertex LLM does not Handle FunctionCall tools #16678

Open
stfines-clgx opened this issue Oct 24, 2024 · 11 comments · May be fixed by #16793
Labels
bug (Something isn't working), triage (Issue needs to be triaged/prioritized)

Comments


stfines-clgx commented Oct 24, 2024

Bug Description

When using a FunctionTool, I receive the following error:
AttributeError: 'FunctionCall' object has no attribute '_pb'

It seems that this tool makes some assumptions about the input data types.

Version

0.11.7, llama-index-llms-vertex==0.3.7

Steps to Reproduce

Update the code in the multi-agent concierge example to use Vertex, then try to run it.

Relevant Logs/Tracebacks

INFO:chromadb.telemetry.product.posthog:Anonymized telemetry enabled. See                     https://docs.trychroma.com/telemetry for more information.
ERROR:asyncio:Exception in callback Dispatcher.span.<locals>.wrapper.<locals>.handle_future_result(span_id='Workflow.run...-6419e73b17c5', bound_args=<BoundArgumen...bose': True})>, instance=<clgx.agent.e...t 0x3230e6a50>, context=<_contextvars...t 0x31fb73e00>)(<WorkflowHand...ibute '_pb'")>) at /Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py:273
handle: <Handle Dispatcher.span.<locals>.wrapper.<locals>.handle_future_result(span_id='Workflow.run...-6419e73b17c5', bound_args=<BoundArgumen...bose': True})>, instance=<clgx.agent.e...t 0x3230e6a50>, context=<_contextvars...t 0x31fb73e00>)(<WorkflowHand...ibute '_pb'")>) at /Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py:273>
Traceback (most recent call last):
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 283, in handle_future_result
    raise exception
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/Users/stfines/workspace/propertyPoc-LlamaIndex/clgx/agent/CoreSearchAgent.py", line 242, in main
    result = await handler
             ^^^^^^^^^^^^^
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/futures.py", line 290, in __await__
    return self.result()  # May raise too.
           ^^^^^^^^^^^^^
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/workflow/workflow.py", line 376, in _run_workflow
    raise exception_raised
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/workflow/workflow.py", line 233, in _task
    raise e from None
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/workflow/workflow.py", line 229, in _task
    new_ev = await instrumented_step(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 357, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/stfines/workspace/propertyPoc-LlamaIndex/clgx/agent/events/CoreSearchEvents.py", line 358, in orchestrator
    tool_calls = llm.get_tool_calls_from_response(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/llms/vertex/base.py", line 498, in get_tool_calls_from_response
    response_dict = MessageToDict(tool_call._pb)
                                  ^^^^^^^^^^^^^
AttributeError: 'FunctionCall' object has no attribute '_pb'
Traceback (most recent call last):
  File "/Users/stfines/Applications/PyCharm Professional Edition.app/Contents/plugins/python/helpers-pro/pydevd_asyncio/pydevd_nest_asyncio.py", line 138, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/stfines/Applications/PyCharm Professional Edition.app/Contents/plugins/python/helpers-pro/pydevd_asyncio/pydevd_nest_asyncio.py", line 243, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/Users/stfines/workspace/propertyPoc-LlamaIndex/clgx/agent/CoreSearchAgent.py", line 242, in main
    result = await handler
             ^^^^^^^^^^^^^
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/futures.py", line 290, in __await__
    return self.result()  # May raise too.
           ^^^^^^^^^^^^^
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/workflow/workflow.py", line 376, in _run_workflow
    raise exception_raised
  File "/Users/stfines/.pyenv/versions/3.11.6/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/workflow/workflow.py", line 233, in _task
    raise e from None
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/workflow/workflow.py", line 229, in _task
    new_ev = await instrumented_step(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 357, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/stfines/workspace/propertyPoc-LlamaIndex/clgx/agent/events/CoreSearchEvents.py", line 358, in orchestrator
    tool_calls = llm.get_tool_calls_from_response(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/stfines/Library/Caches/pypoetry/virtualenvs/propertypoc-llamaindex-CWmV6wNU-py3.11/lib/python3.11/site-packages/llama_index/llms/vertex/base.py", line 498, in get_tool_calls_from_response
    response_dict = MessageToDict(tool_call._pb)
                                  ^^^^^^^^^^^^^
AttributeError: 'FunctionCall' object has no attribute '_pb'
@stfines-clgx added the bug (Something isn't working) and triage (Issue needs to be triaged/prioritized) labels on Oct 24, 2024

dosubot bot commented Oct 24, 2024

Hey @stfines-clgx! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this together!

The error you're encountering, AttributeError: 'FunctionCall' object has no attribute '_pb', suggests that the FunctionCall object is being used in a way that expects it to have a _pb attribute, which it does not. This issue arises in the context of using the get_tool_calls_from_response method in the LlamaIndex framework, specifically with the Vertex LLM integration.

To address this issue, you should check the following:

  1. Verify the FunctionCall Object: Ensure that the FunctionCall object is being correctly instantiated and used. It might be necessary to review the implementation of FunctionCall to understand its attributes and methods.

  2. Check for Updates or Patches: Since this might be a compatibility issue between the versions of LlamaIndex and llama-index-llms-vertex you are using, check if there are any updates or patches available that address this specific problem.

  3. Review the get_tool_calls_from_response Method: This method is designed to extract tool call information from a ChatResponse object. Ensure that the response object being passed to this method is correctly structured and contains the expected data.

  4. Alternative Approaches: If the _pb attribute is expected for serialization or conversion purposes, consider using alternative methods or attributes that are available on the FunctionCall object.

If these steps do not resolve the issue, further investigation into the specific implementation details of the FunctionCall class and its interaction with the Vertex LLM might be necessary.
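The fourth suggestion above — falling back to other attributes when `_pb` is absent — can be sketched as follows. The class and helper names here are hypothetical stand-ins for illustration, not the real vertexai SDK types:

```python
class NewStyleFunctionCall:
    """Stand-in for a newer SDK wrapper: no `_pb`, just public attributes."""

    def __init__(self, name, args):
        self.name = name
        self.args = args


def tool_call_to_dict(tool_call):
    """Convert a tool call to a plain dict without assuming a `_pb` attribute."""
    if hasattr(tool_call, "to_dict"):
        # Some SDK wrappers expose an explicit serializer; prefer it.
        return tool_call.to_dict()
    if hasattr(tool_call, "_pb"):
        # Old proto-plus path: google.protobuf.json_format.MessageToDict(tool_call._pb)
        # would run here; omitted so this sketch has no protobuf dependency.
        raise TypeError("proto-plus path omitted in this sketch")
    # Fallback: read the public attributes directly.
    return {"name": tool_call.name, "args": dict(tool_call.args)}


call = NewStyleFunctionCall("search_listings", {"query": "3-bed homes"})
print(tool_call_to_dict(call))  # → {'name': 'search_listings', 'args': {'query': '3-bed homes'}}
```

In the real integration, the `_pb` branch is where `MessageToDict(tool_call._pb)` currently runs unconditionally; probing for the attribute first is what avoids the AttributeError in this report.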


@logan-markewich
Collaborator

Probably related to the version of the generative-ai package?

I don't have access to Vertex to test, but the code is here

Happy to accept a PR to fix

@stfines-clgx
Author

That does seem possible, especially since the pyproject.toml lists an aiplatform dependency of 1.39.0 and the current version is 1.70.0. I'll pull locally and see if I can build this lib with a newer dependency.
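When comparing the pinned dependency against what's actually in the environment, a quick check like this can confirm the installed build (a sketch using the standard library; the distribution name is the PyPI package name):

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(dist_name: str) -> str:
    """Return the installed version of a distribution, or a marker if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "not installed"


# Prints e.g. "1.70.0", or "not installed" if the package is missing.
print(installed_version("google-cloud-aiplatform"))
```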

@logan-markewich
Collaborator

That would be amazing, thank you!

@stfines-clgx
Author

Taking a look at this, the entire set of interfaces has changed significantly. I've forked and will rework how the integration interacts with Gemini so that it uses the actual released interface and doesn't depend on unreleased/private features, which should address the problem.

Once I can verify that it functions I'll submit a PR - it'll probably require a new version update.

@stfines-clgx
Author

OK, this fork has a branch, dev/1_70_protobuf_only, that seems to resolve this issue. My testing with several different types of functions and non-function invocations has it working well. It does not resolve
#16625 - it looks like that will take action from the google-cloud-aiplatform team, though I will continue to explore mitigations.

@logan-markewich if you would take a look prior to my creating a PR and let me know what else needs doing to make a good PR for this project I would appreciate it.

@noabenefraim

I encountered the same bug. However, this fix did not resolve my issue. I will investigate further and see what insights I can offer.

Thank you for investigating this @stfines-clgx, appreciate it.

@stfines-clgx
Author

stfines-clgx commented Nov 18, 2024

If you can post your error and input, I can take a look and see what I need to add - I can't test all the different variations in input here, so if I can make some unit tests it would help quite a bit.

To be sure you actually use this version, you have to pull the fork and run
poetry build; it will then create an archive in the dist folder like 0.4.0.vai.fixes.tar.gz, which you'll have to add to your package manager however you do that (I use poetry, and I do something like llama-index-llms-vertex = {path = "/Users/stfines/workspace/llama_index/llama-index-integrations/llms/llama-index-llms-vertex/dist/0.4.0.vai.fixes.tar.gz", develop=true})
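For reference, the path-dependency line above as a pyproject.toml fragment (the path here is illustrative and machine-specific; adjust it to wherever your `poetry build` output lands):

```toml
[tool.poetry.dependencies]
# Point the path at the archive produced by `poetry build`, as in the snippet above.
llama-index-llms-vertex = { path = "dist/0.4.0.vai.fixes.tar.gz", develop = true }
```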

@stfines-clgx
Author

stfines-clgx commented Nov 18, 2024

@logan-markewich can you assign this, #16625, and #16970 to me? It will make it easier for me to keep track of comments on all of them

@noabenefraim

noabenefraim commented Nov 18, 2024

@stfines-clgx

We confirmed we are running your version. Here is the error we are seeing - google.protobuf.json_format.ParseError: Failed to parse properties field: Failed to parse anyOf field: Failed to parse type field: Invalid enum value NULL for enum type google.cloud.aiplatform.v1beta1.Type at Schema.properties[images].anyOf[1].type...

Before we merged your code, we were getting this error: AttributeError: 'FunctionCall' object has no attribute '_pb'

Going to add more context below:

This is the repo we are working off of - https://github.com/run-llama/multi-agent-concierge/tree/main
The only file I changed in this repository is main.py. I have attached it as a .txt because GitHub won't let me upload a .py file.
I've also attached my current environment dependencies.

main.txt
req_llama_fork.txt

Thank you for your help, and let me know if you need more info.
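For context on the ParseError quoted above: JSON Schema renders an Optional parameter as an `anyOf` that includes `{"type": "null"}`, and — per the error message itself — `google.cloud.aiplatform.v1beta1.Type` has no NULL enum member, so the schema fails to parse. One possible client-side mitigation is to strip the null branch before the schema is sent. This is a sketch only; the key names follow JSON Schema, not the actual llama-index internals:

```python
def strip_null_variants(schema):
    """Recursively drop {"type": "null"} branches from anyOf lists."""
    if isinstance(schema, dict):
        if "anyOf" in schema:
            variants = [v for v in schema["anyOf"] if v.get("type") != "null"]
            if len(variants) == 1:
                # Collapse a single surviving variant into the parent schema.
                schema = {**{k: v for k, v in schema.items() if k != "anyOf"}, **variants[0]}
            else:
                schema = {**schema, "anyOf": variants}
        return {k: strip_null_variants(v) for k, v in schema.items()}
    if isinstance(schema, list):
        return [strip_null_variants(v) for v in schema]
    return schema


# Shape mirroring the failing field from the error message above.
schema = {
    "type": "object",
    "properties": {
        "images": {"anyOf": [{"type": "string"}, {"type": "null"}]}
    },
}
print(strip_null_variants(schema))
# → {'type': 'object', 'properties': {'images': {'type': 'string'}}}
```

Note this changes the declared type (the parameter is no longer nullable as far as the model knows), so it is a workaround, not a fix for the upstream enum limitation.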

@noabenefraim

Here's another simple unit test that is failing.

Coming from this documentation - https://docs.llamaindex.ai/en/stable/examples/workflow/function_calling_agent/

function_calling.txt
