From 1736097a5902ef39fd289a645ad9765058f1c8b3 Mon Sep 17 00:00:00 2001
From: Jerry Liu
Date: Fri, 29 Dec 2023 15:38:09 -0800
Subject: [PATCH] nit: add llmcompiler as agent guide (#9747)

---
 docs/module_guides/deploying/agents/modules.md | 9 +++++++++
 docs/module_guides/deploying/agents/root.md    | 1 +
 2 files changed, 10 insertions(+)

diff --git a/docs/module_guides/deploying/agents/modules.md b/docs/module_guides/deploying/agents/modules.md
index f43d6b7de9331..1813a928c83da 100644
--- a/docs/module_guides/deploying/agents/modules.md
+++ b/docs/module_guides/deploying/agents/modules.md
@@ -42,6 +42,15 @@ maxdepth: 1
 /examples/agent/react_agent_with_query_engine.ipynb
 ```
 
+## Additional Agents (available on LlamaHub)
+
+```{toctree}
+---
+maxdepth: 1
+---
+LLMCompiler Agent Cookbook
+```
+
 (lower-level-agent-api)=
 
 ## Lower-Level Agent API
diff --git a/docs/module_guides/deploying/agents/root.md b/docs/module_guides/deploying/agents/root.md
index 9a289aa3962eb..82dbbd3969a17 100644
--- a/docs/module_guides/deploying/agents/root.md
+++ b/docs/module_guides/deploying/agents/root.md
@@ -22,6 +22,7 @@ The reasoning loop depends on the type of agent. We have support for the followi
 
 - OpenAI Function agent (built on top of the OpenAI Function API)
 - a ReAct agent (which works across any chat/text completion endpoint).
+- a LLMCompiler Agent (available as a [LlamaPack](https://llamahub.ai/l/llama_packs-agents-llm_compiler?from=llama_packs), [source repo](https://github.com/SqueezeAILab/LLMCompiler))
 
 ### Tool Abstractions
 
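For readers who want to try the LLMCompiler agent referenced by this patch, the sketch below shows one plausible way to pull down and run the LlamaPack with llama_index's `download_llama_pack` helper. The pack class name `LLMCompilerAgentPack`, its constructor arguments, its `run()` behavior, and the `multiply` tool are assumptions for illustration only; they are not part of this patch, so consult the LlamaHub listing and source repo linked above for the actual interface.

```python
# Hedged sketch (not part of the patch): fetching and running the
# LLMCompiler agent LlamaPack. The pack name "LLMCompilerAgentPack" and its
# constructor/run signature are assumptions; see the LlamaHub listing.
from llama_index.llama_pack import download_llama_pack
from llama_index.llms import OpenAI
from llama_index.tools import FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


# Download the pack's source into a local directory and get its entry class.
LLMCompilerAgentPack = download_llama_pack(
    "LLMCompilerAgentPack", "./llm_compiler_pack"
)

# Hand the pack a tool list and an LLM (assumed constructor arguments).
tools = [FunctionTool.from_defaults(fn=multiply)]
pack = LLMCompilerAgentPack(tools, llm=OpenAI(model="gpt-4"))

# The pack is assumed to expose a run() helper that forwards the query
# to its underlying agent.
print(pack.run("What is 123 * 456?"))
```

As with the OpenAI Function and ReAct agents listed in root.md, the pack wraps an agent that plans tool calls in a reasoning loop; the download-and-run flow above is the generic LlamaPack usage pattern rather than anything specific introduced by this commit.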