From 65114ba9e0267e43e31899ddcf17405a0b825b94 Mon Sep 17 00:00:00 2001 From: Scott Twiname Date: Thu, 19 Dec 2024 18:44:59 +1300 Subject: [PATCH] Update AI docs for OpenAI support (#578) * Update AI docs for OpenAI support * Update cli reference --- docs/ai/build/rag.md | 2 +- docs/ai/run/cli.md | 23 +++++++++++++++++++---- docs/ai/welcome.md | 8 +++++--- 3 files changed, 25 insertions(+), 8 deletions(-) diff --git a/docs/ai/build/rag.md b/docs/ai/build/rag.md index 6747bccbeae..0f3151d5322 100644 --- a/docs/ai/build/rag.md +++ b/docs/ai/build/rag.md @@ -11,7 +11,7 @@ Defining the RAG data set is largely up to the user to define. Currently only [L We do provide an off the shelf way to create a table from markdown files. This will parse and chunk the content appropriately and use the `nomic-embed-text` model to generate vectors. ```shell -subql-ai embed-mdx -i ./path/to/dir/with/markdown -o ./db --table your-table-name +subql-ai embed-mdx -i ./path/to/dir/with/markdown -o ./db --table your-table-name --model nomic-embed-text ``` ## Adding RAG to your app diff --git a/docs/ai/run/cli.md b/docs/ai/run/cli.md index be6d0e0f4e7..19472e6c42a 100644 --- a/docs/ai/run/cli.md +++ b/docs/ai/run/cli.md @@ -1,14 +1,15 @@ # CLI Reference ``` -Run an AI app +Run a SubQuery AI app Commands: - subql-ai Run an AI app [default] + subql-ai Run a SubQuery AI app [default] subql-ai info Get information on a project subql-ai embed-mdx Creates a Lance db table with embeddings from MDX files subql-ai repl Creates a CLI chat with a running app - subql-ai publish Publishes a project to IPFS so it can be easily distributed + subql-ai publish Publishes a project to IPFS so it can be easily + distributed subql-ai init Create a new project skeleton Options: @@ -19,8 +20,16 @@ Options: [string] [default: "https://unauthipfs.subquery.network/ipfs/api/v0/"] --ipfsAccessToken A bearer authentication token to be used with the ipfs endpoint [string] - -h, --host The ollama RPC host + --cacheDir The 
location to cache data from ipfs. Default is a temp
+                         directory [string]
+      --debug            Enable debug logging [boolean] [default: false]
+      --logFmt           Set the logger format
+                         [string] [choices: "json", "pretty"] [default: "pretty"]
+  -h, --host             The LLM RPC host. If the project model uses an OpenAI
+                         model then the default value is not used.
                          [string] [default: "http://localhost:11434"]
+      --openAiApiKey     If the project models use OpenAI models, then this API
+                         key will be passed on to the OpenAI client [string]
   -i, --interface        The interface to interact with the app
                              [string] [choices: "cli", "http"] [default: "http"]
       --port             The port the http service runs on
@@ -29,8 +38,14 @@ Options:
                          use the cached version [boolean] [default: false]
       --toolTimeout      Set a limit for how long a tool can take to run, unit
                          is MS [number] [default: 10000]
+      --streamKeepAlive  The interval in MS to send empty data in stream
+                         responses to keep the connection alive. Only works with
+                         http interface. Use 0 to disable.
+                         [number] [default: 5000]
 ```
 
+These can also be specified with environment variables. They should be prefixed with `SUBQL_AI_` and the flag renamed to capitalized snake case. E.g. `SUBQL_AI_CACHE_DIR`
+
 ### `subql-ai`
 
 Run an AI app.
diff --git a/docs/ai/welcome.md b/docs/ai/welcome.md
index ccf6c62bf75..a9aff4b522c 100644
--- a/docs/ai/welcome.md
+++ b/docs/ai/welcome.md
@@ -10,7 +10,7 @@ AI apps are self contained and easily scalable AI agents that you can use to pow
 - **Empower your AI with RAGs:** By integrating [RAG (Retrieval-Augmented Generation) files](./build/rag.md), your AI Apps can leverage domain-specific knowledge efficiently. With initial support for LanceDB and future compatibility with other vector databases, developers can enhance their applications' performance and accuracy. Additionally, publishing to IPFS ensures data integrity and accessibility.
 - **Your AI journey starts here:** The SubQuery AI App framework is designed with user-friendliness in mind, providing intuitive wrappers around core features. This lowers the barrier to entry for developers of all skill levels, making it easier to create, run, and deploy AI Apps.
 - **Connect, create, and integrate with function tooling:** You can extend your AI Apps with additional [function tooling](./build/function_tools.md), facilitating connections to external systems and tools. This capability enables rich integrations, allowing users to create versatile applications that can interact seamlessly with blockchains and other ecosystems.
-- **Choose your model:** By supporting a range of open-source LLM models, starting with Ollama-compatible ones, the SubQuery AI App Framework ensures that users can choose the best model for their applications without being locked into a specific model ecosystem. This flexibility fosters open-source innovation.
+- **Choose your model:** By supporting a range of open-source Ollama LLM models as well as OpenAI, the SubQuery AI App Framework ensures that users can choose the best model for their applications without being locked into a specific model ecosystem. This flexibility fosters open-source innovation.
 - **Proven standards for seamless integration:** SubQuery AI Apps expose the industry-standard [OpenAI API](./query/query.md), ensuring compatibility with a wide range of applications and tools. This makes it easier for developers to integrate AI capabilities into their projects while adhering to established standards.
 
 ![AI App Framework Features](/assets/img/ai/features.jpg)
@@ -22,7 +22,9 @@ AI apps are self contained and easily scalable AI agents that you can use to pow
 
 To use the framework there are a couple of dependencies:
 
 - [Deno](https://deno.land/). The SubQuery AI framework is built on Deno and is needed to build your app.
-- [Ollama](https://ollama.com/). Alternatively an endpoint to an Ollama instance.
+- An LLM
+  - [Ollama](https://ollama.com/). Alternatively, an endpoint to an Ollama instance.
+  - [OpenAI](https://platform.openai.com). You will need a paid API key.
 
 ### Install the framework
 
@@ -38,7 +40,7 @@
 You can confirm installation by running `subql-ai --help`.
 
 ## Create a new App
 
-You can initialise a new app using `subql-ai init`. It will ask you to provide a name and a Ollama model to use.
+You can initialise a new app using `subql-ai init`. It will ask you to provide a name and an LLM model to use.
 
 ![Init a new AI App](/assets/img/ai/guide-init.png)
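A note on the environment-variable convention this patch adds to docs/ai/run/cli.md: the renaming rule (prefix with `SUBQL_AI_`, flag name converted to capitalized snake case) can be sketched with a small shell helper. `to_env_var` below is a hypothetical illustration, not part of the CLI; the authoritative mapping is whatever the `subql-ai` binary implements.

```shell
# Sketch only: derive the env var name for a camelCase flag, assuming the
# documented convention (SUBQL_AI_ prefix + capitalized snake case).
to_env_var() {
  # Insert "_" before each capital letter, then uppercase everything.
  suffix=$(printf '%s' "$1" | sed 's/\([A-Z]\)/_\1/g' | tr '[:lower:]' '[:upper:]')
  printf 'SUBQL_AI_%s\n' "$suffix"
}

to_env_var cacheDir         # SUBQL_AI_CACHE_DIR
to_env_var streamKeepAlive  # SUBQL_AI_STREAM_KEEP_ALIVE
```

Under that convention, `SUBQL_AI_CACHE_DIR=./cache subql-ai` should behave the same as `subql-ai --cacheDir ./cache`.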