diff --git a/source/_static/badges/academy-copilot-setup.rst b/source/_static/badges/academy-copilot-setup.rst
new file mode 100644
index 00000000000..af6d89857a5
--- /dev/null
+++ b/source/_static/badges/academy-copilot-setup.rst
@@ -0,0 +1,12 @@
+:orphan:
+:nosearch:
+
+.. raw:: html
+
+   Mattermost Academy
+   Learn about setting up and configuring Mattermost Copilot with multiple LLMs
\ No newline at end of file
diff --git a/source/configure/enable-copilot.rst b/source/configure/enable-copilot.rst
index 6ad7553870f..ecbaa4b31aa 100644
--- a/source/configure/enable-copilot.rst
+++ b/source/configure/enable-copilot.rst
@@ -6,6 +6,9 @@ Enable Copilot
 
 Significantly increase team productivity and decision-making speed by enhancing your real-time collaboration capabilities with instant access to AI-generated information, discussion summaries, and contextually-aware action recommendations with Mattermost's Copilot. Your users can interact with AI capabilities directly within their daily communication channels without needing to switch between multiple tools or platforms.
 
+.. include:: ../_static/badges/academy-copilot-setup.rst
+   :start-after: :nosearch:
+
 Setup
 ------
 
@@ -82,7 +85,7 @@ Configure a large language model (LLM) for your Copilot integration by going to
 
    1. Deploy your model, for example, on `Ollama <https://ollama.com/>`_.
    2. Select **OpenAI Compatible** in the **AI Service** dropdown.
-   3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field.
+   3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append ``/v1`` to the end of the URL if using Ollama (for example, ``http://localhost:11434/v1`` for Ollama, or ``http://localhost:11434/`` otherwise).
    4. If using Ollama, leave the **API Key** field blank.
    5. Specify your model name in the **Default Model** field.
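The URL rule in step 3 of the updated hunk can be sketched as a small helper. This is illustrative only: ``ollama_api_url`` is a hypothetical name, and the sketch assumes Ollama's default port of 11434.

```python
from urllib.parse import urlparse

def ollama_api_url(host: str = "localhost", port: int = 11434) -> str:
    """Build the **API URL** value for an Ollama backend: the port is
    written explicitly and /v1 is appended for the OpenAI-compatible API."""
    return f"http://{host}:{port}/v1"

url = ollama_api_url()
# The resulting URL carries an explicit port and the /v1 suffix.
assert urlparse(url).port == 11434
assert url.endswith("/v1")
print(url)  # http://localhost:11434/v1
```

For a non-Ollama OpenAI-compatible service, the same value without the ``/v1`` suffix may apply, as the hunk notes.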