diff --git a/docs/tutorials/weaviate.md b/docs/tutorials/weaviate.md
index 221bef09..bd4ce417 100644
--- a/docs/tutorials/weaviate.md
+++ b/docs/tutorials/weaviate.md
@@ -3,12 +3,14 @@
 Weaviate is a vector search engine that can integrate seamlessly with KubeAI's embedding and generative models.
 This tutorial demonstrates how to deploy both KubeAI and Weaviate in a Kubernetes cluster, using KubeAI as the OpenAI endpoint for Weaviate.
 
 Why use KubeAI with Weaviate?
+
 - Security and privacy: KubeAI runs locally in your Kubernetes cluster, so your data never leaves your infrastructure.
 - Cost savings: KubeAI can run on your existing hardware, so you avoid paying for hosted embedding and generative models. This tutorial uses CPU-only models, so it should work even on your laptop.
 
 As you go through this tutorial, you will learn how to:
+
 - Deploy KubeAI with embedding and generative models
 - Install Weaviate and connect it to KubeAI
 - Import data into Weaviate
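
The intro above says the tutorial uses KubeAI as the OpenAI endpoint for Weaviate. As a rough illustration of that wiring (not the tutorial's actual steps), here is a minimal Python sketch. It assumes KubeAI's OpenAI-compatible API is reachable in-cluster at `http://kubeai/openai` (the service name and path are assumptions), that Weaviate is reachable locally (e.g. via port-forward), and that Weaviate's OpenAI module honors a base-URL override header.

```python
# Sketch: point Weaviate's OpenAI integration at KubeAI instead of api.openai.com.
# Assumptions: KubeAI's OpenAI-compatible API is exposed at http://kubeai/openai
# inside the cluster, and Weaviate itself is reachable on localhost (port-forward).
import weaviate

client = weaviate.connect_to_local(
    headers={
        # Weaviate forwards embedding/generation requests to this base URL.
        "X-OpenAI-BaseURL": "http://kubeai/openai",
        # KubeAI does not validate the key, but the OpenAI module expects one to be set.
        "X-OpenAI-Api-Key": "not-used",
    }
)
print(client.is_ready())
client.close()
```

The same base-URL override can usually be set per collection in the vectorizer/generative module config instead of via request headers; the tutorial's later steps show the exact configuration it uses.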