From d73e7e869097f1e8bd24c74ed66c40dfbccba24d Mon Sep 17 00:00:00 2001
From: Karuna-Mendix
Date: Mon, 7 Jul 2025 16:56:05 +0530
Subject: [PATCH 1/2] Language review for Azure AI Search

---
 content/en/docs/marketplace/genai/_index.md | 9 ++---
 .../marketplace/genai/how-to/byo_connector.md | 2 +-
 .../mendix-cloud-genai/Mx GenAI Connector.md | 2 +-
 .../external-platforms/openai.md | 37 ++++++++++++++++++-
 .../genai/reference-guide/genai-commons.md | 2 +-
 5 files changed, 42 insertions(+), 10 deletions(-)

diff --git a/content/en/docs/marketplace/genai/_index.md b/content/en/docs/marketplace/genai/_index.md
index 88d640e257c..23f54288c3a 100644
--- a/content/en/docs/marketplace/genai/_index.md
+++ b/content/en/docs/marketplace/genai/_index.md
@@ -46,11 +46,11 @@ Supercharge your applications with Mendix's Agents Kit. This powerful set of com
 | Asset | Description | Type | Studio Pro Version |
 |-------------------|---------------------------------------------------|----------------------------------|------------|
-| [Agent Builder Starter App](https://marketplace.mendix.com/link/component/240369) | See an example of how to build an agentic Mendix app. Use the Agent Builder from Agent Commons to build your support assistant. | Starter App | 10.21 |
+| [Agent Builder Starter App](https://marketplace.mendix.com/link/component/240369) (formerly known as Support Assistant Starter App) | See an example of how to build an agentic Mendix application. Use the Agent Builder from Agent Commons to build your support assistant. | Starter App | 10.24 |
 | [Agent Commons](/appstore/modules/genai/genai-for-mx/agent-commons/) | Build agentic functionality using common patterns in your application by defining, testing, and evaluating agents at runtime. | Common Module | 10.21 |
 | [AI Bot Starter App](https://marketplace.mendix.com/link/component/227926) | Lets you kick-start the development of enterprise-grade AI chatbot experiences. 
For example, you can use it to create your own private enterprise-ready ChatGPT-like app. | Starter App | 10.21 |
 | [Amazon Bedrock Connector](/appstore/modules/aws/amazon-bedrock/) | Connect to Amazon Bedrock. Use Retrieve and Generate or Bedrock agents. | Connector Module | 10.21 |
-| [Blank GenAI App](https://marketplace.mendix.com/link/component/227934) | Start from scratch to create a new application with GenAI capabilities and without any dependencies. | Starter App | 10.21 |
+| [Blank GenAI App](https://marketplace.mendix.com/link/component/227934) | Start from scratch to create a new application with GenAI capabilities and without any dependencies. | Starter App | 10.24 |
 | [Conversational UI](/appstore/modules/genai/conversational-ui/) | Create a Conversational UI or monitor token consumption in your app. | UI Module | 10.21 |
 | [GenAI Commons](/appstore/modules/genai/commons/) | Common capabilities that allow all GenAI connectors to be integrated with the other modules. You can also implement your own connector based on this. | Common Module | 10.21 |
 | [GenAI Showcase App](https://marketplace.mendix.com/link/component/220475) | Understand what you can build with generative AI. Understand how to implement the Mendix Cloud GenAI, OpenAI, and Amazon Bedrock connectors and how to integrate them with the Conversational UI module. |Showcase App | 10.21 |
@@ -58,9 +58,8 @@ Supercharge your applications with Mendix's Agents Kit. This powerful set of com
 | [Mendix Cloud GenAI Connector](/appstore/modules/genai/mx-cloud-genai/MxGenAI-connector/) | Connect to Mendix Cloud and utilize Mendix Cloud GenAI resource packs directly within your Mendix application. | Connector Module | 10.21 |
 | [OpenAI Connector](/appstore/modules/genai/openai/) | Connect to (Azure) OpenAI. | Connector Module | 10.21 |
 | [PgVector Knowledge Base](/appstore/modules/genai/pgvector/) | Manage and interact with a PostgreSQL *pgvector* Knowledge Base. 
| Connector Module | 10.21 |
-| [RFP Assistant Starter App / Questionnaire Assistant Starter App](https://marketplace.mendix.com/link/component/235917) | The RFP Assistant Starter App and the Questionnaire Assistant Starter App leverage historical question-answer pairs (RFPs)) and a continuously updated knowledge base to generate and assist in editing responses to RFPs. This offers a time-saving alternative to manually finding similar responses and enhancing the knowledge management process. | Starter App | 10.21 |
+| [RFP Assistant Starter App / Questionnaire Assistant Starter App](https://marketplace.mendix.com/link/component/235917) | The RFP Assistant Starter App and the Questionnaire Assistant Starter App leverage historical question-answer pairs (RFPs) and a continuously updated knowledge base to generate and assist in editing responses to RFPs. This offers a time-saving alternative to manually finding similar responses and enhancing the knowledge management process. | Starter App | 10.21 |
 | [Snowflake Showcase App](https://marketplace.mendix.com/link/component/225845) | Learn how to implement the Cortex functionalities in your app. | Showcase App | 10.21 |
-| [Support Assistant Starter App](https://marketplace.mendix.com/link/component/231035) | Learn how to combine common GenAI patterns, such as function calling and RAG to build your support assistant. Connect it to a model like Anthropic Claude via Mendix Cloud GenAI or Amazon Bedrock or use an Azure OpenAI subscription. | Starter App | 10.21 |
 
 Older versions of the marketplace modules and GenAI Showcase App are available in Studio Pro 9.24.2. 
@@ -72,7 +71,7 @@ Mendix connectors offer direct support for the following models:
 |--------------|---------------------|---------------------|-------------------|-----------|-------------------------|
 | Mendix Cloud GenAI | Anthropic Claude 3.5 Sonnet | Chat Completions | text, image, document | text | Function calling |
 | | Cohere Embed English, Cohere Embed Multilingual | Embeddings | text | embeddings | |
-| Azure / OpenAI | gpt-4, gpt-4-turbo, gpt-4o, gpt-4o mini, gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, gpt-4.5-preview | Chat completions | text, image, document (OpenAI only) | text | Function calling |
+| Azure / OpenAI | gpt-4, gpt-4-turbo, gpt-4o, gpt-4o mini, gpt-4.1, gpt-4.1-mini, gpt-4.1-nano | Chat completions | text, image, document (OpenAI only) | text | Function calling |
 | | DALL·E 2, DALL·E 3, gpt-image-1 | Image generation | text | image | |
 | | text-embedding-ada-002, text-embedding-3-small, text-embedding-3-large | Embeddings | text | embeddings| |
 | Amazon Bedrock | Amazon Titan Text G1 - Express, Amazon Titan Text G1 - Lite, Amazon Titan Text G1 - Premier | Chat Completions | text, document (except Titan Premier) | text | |
diff --git a/content/en/docs/marketplace/genai/how-to/byo_connector.md b/content/en/docs/marketplace/genai/how-to/byo_connector.md
index 6949df3ac22..0062f216593 100644
--- a/content/en/docs/marketplace/genai/how-to/byo_connector.md
+++ b/content/en/docs/marketplace/genai/how-to/byo_connector.md
@@ -12,7 +12,7 @@ If you want to create your own connection to the LLM model of your choice while
 Building your own GenAI Commons connector offers several practical benefits that streamline development and enhance flexibility. You can reuse [ConversationalUI](/appstore/modules/genai/genai-for-mx/conversational-ui/) components, quickly set up with [starter apps](/appstore/modules/genai/how-to/starter-template/), and switch providers effortlessly. 
This guide will help you integrate your preferred LLM while maintaining a seamless and user-friendly chat experience.
 
-{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/connectors_diagram.jpg" >}}
+{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/connectors_diagram.svg" >}}
 
 ### Prerequisites
 
diff --git a/content/en/docs/marketplace/genai/mendix-cloud-genai/Mx GenAI Connector.md b/content/en/docs/marketplace/genai/mendix-cloud-genai/Mx GenAI Connector.md
index 174fd2bc444..22e61eea244 100644
--- a/content/en/docs/marketplace/genai/mendix-cloud-genai/Mx GenAI Connector.md
+++ b/content/en/docs/marketplace/genai/mendix-cloud-genai/Mx GenAI Connector.md
@@ -38,7 +38,7 @@ The module enables tailoring generated responses to specific contexts by groundi
 Knowledge bases are often used for:
 
-1. [Retrieval Augmented Generation (RAG)](https://docs.mendix.com/appstore/modules/genai/rag/) retrieves relevant knowledge from the knowledge base, incorporates it into a prompt, and sends it to the model to generate a response.
+1. [Retrieval Augmented Generation (RAG)](/appstore/modules/genai/rag/) retrieves relevant knowledge from the knowledge base, incorporates it into a prompt, and sends it to the model to generate a response.
 2. Semantic search enables advanced search capabilities by considering the semantic meaning of the text, going beyond exact and approximate matching. It allows the knowledge base to be searched for similar chunks effectively.
 
 If you are looking for a step-by-step guide on how to get your application data into a Mendix Cloud Knowledge Base, refer [Grounding Your Large Language Model in Data – Mendix Cloud GenAI](/appstore/modules/genai/how-to/howto-groundllm/). Note that the Mendix Portal also provides options for importing data into your knowledge base, such as file uploads. 
For more information, see [Navigate through the Mendix Cloud GenAI Portal](/appstore/modules/genai/mx-cloud-genai/Navigate-MxGenAI/). This documentation focuses solely on adding data from an application using the connector.
 
diff --git a/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md b/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md
index 02c5041f905..f4f61583fee 100644
--- a/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md
+++ b/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md
@@ -13,7 +13,7 @@ aliases:
 The [OpenAI Connector](https://marketplace.mendix.com/link/component/220472) allows you to integrate generative AI into your Mendix app. It is compatible with [OpenAI's platform](https://platform.openai.com/) as well as [Azure's OpenAI service](https://oai.azure.com/).
 
-The current scope covers text generation use cases based on the [OpenAI Chat Completions API](https://platform.openai.com/docs/api-reference/chat), image generation use cases based on the [Image Generations API](https://platform.openai.com/docs/api-reference/images), and embedding use cases based on the [Embeddings API](https://platform.openai.com/docs/api-reference/embeddings).
+The current scope covers text generation use cases based on the [OpenAI Chat Completions API](https://platform.openai.com/docs/api-reference/chat), image generation use cases based on the [Image Generations API](https://platform.openai.com/docs/api-reference/images), and embedding use cases based on the [Embeddings API](https://platform.openai.com/docs/api-reference/embeddings). Furthermore, indexes via [Azure AI Search](https://learn.microsoft.com/en-us/azure/search/) can be used for knowledge base retrieval.
 
 Mendix provides dual-platform support for both OpenAI and Azure OpenAI. 
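The knowledge base retrieval that the added sentence describes ultimately maps to a query against an Azure AI Search index over REST. As a rough illustration of what such a call looks like (not part of the connector's implementation; the resource name, index name, and API version below are placeholder assumptions you would substitute with your own values):

```python
import json
import urllib.request


def build_search_request(endpoint, index_name, api_key, query, top=5,
                         api_version="2023-11-01"):
    """Build a POST request for Azure AI Search's "Search Documents" REST API.

    `endpoint` is the resource URL, for example
    https://<your-resource-name>.search.windows.net. The `api-key` header is
    used for authorization.
    """
    url = f"{endpoint}/indexes/{index_name}/docs/search?api-version={api_version}"
    body = json.dumps({"search": query, "top": top}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )


# Hypothetical resource and index names; no network call is made until the
# request is actually opened with urllib.request.urlopen(req).
req = build_search_request(
    "https://my-resource.search.windows.net",
    "product-docs",
    "<api-key>",
    "return policy",
)
```

The connector hides this plumbing behind its configuration entities; the sketch only shows the shape of the underlying request.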
@@ -66,11 +66,15 @@ Combine embeddings with text generation capabilities and leverage specific sourc
 For more information on how to set up a vector database, see [Retrieval Augmented Generation (RAG)](/appstore/modules/genai/rag/). Also, check out the [GenAI Showcase App](https://marketplace.mendix.com/link/component/220475) from the Marketplace for an example implementation.
 {{% /alert %}}
 
+#### Knowledge Base
+
+By integrating Azure AI Search, the OpenAI Connector supports knowledge base retrieval from Azure data sources. The most common use case is retrieval augmented generation (RAG), which retrieves relevant knowledge from the knowledge base, incorporates it into a prompt, and sends it to the model to generate a response.
+
 ### Features {#features}
 
 Mendix provides dual-platform support for both [OpenAI](https://platform.openai.com/) and [Azure OpenAI](https://oai.azure.com/).
 
-With the current version, Mendix supports the Chat Completions API for [text generation](https://platform.openai.com/docs/guides/text-generation), the Image Generations API for [images](https://platform.openai.com/docs/guides/images), and the Embeddings API for [vector embeddings](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings).
+With the current version, Mendix supports the Chat Completions API for [text generation](https://platform.openai.com/docs/guides/text-generation), the Image Generations API for [images](https://platform.openai.com/docs/guides/images), the Embeddings API for [vector embeddings](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings), and indexes via [Azure AI Search](https://learn.microsoft.com/en-us/azure/search/) for knowledge base retrieval.
 
 ### Prerequisites {#prerequisites}
 
@@ -143,6 +147,21 @@ The following inputs are required for the Azure OpenAI configuration:
 4. Make sure the right Azure OpenAI resource is selected.
 5. 
You can now view ({{% icon name="view" %}}) and copy ({{% icon name="copy" %}}) the value of the **key1** or **key2** field as your API key while setting up the configuration. Note that these keys might not be visible for everyone in the Azure OpenAI Portal, depending on your organization's security settings.
 
+##### Adding Azure AI Search Resources {#azure-ai-search}
+
+| Parameter | Value |
+| -------------- | ------------------------------------------------------------ |
+| Display name | This is the name identifier of an Azure AI Search resource (for example, *MySearchResource*). |
+| Endpoint URL | This is the API endpoint (for example, `https://your-resource-name.search.windows.net`). 
For details on how to obtain `your-resource-name`, see [Azure AI Search service in the Azure portal](https://learn.microsoft.com/en-us/azure/search/search-create-service-portal). |
+| API version | This is the version of the REST API. |
+| API key | This is the access token to authorize your API call. |
+
+After saving, the indexes in this resource are automatically synced and displayed on the configuration page. Each index can then be added separately to a request when using chat completions.
+
+{{% alert color="warning" %}}
+Currently, the only supported authorization method for Azure AI Search resources is the API key.
+{{% /alert %}}
+
 #### Configuring the OpenAI Deployed Models
 
 A [Deployed Model](/appstore/modules/genai/genai-for-mx/commons/#deployed-model) represents a GenAI model instance that can be used by the app to generate text, embeddings, or images. For every model you want to invoke from your app, you need to create a `OpenAIDeployedModel` record, a specialization of `DeployedModel`. In addition to the model display name and a technical name/identifier, an OpenAI deployed model contains a reference to the additional connection details as configured in the previous step. For OpenAI, a set of common models will be prepopulated automatically upon saving the configuration. If you want to use additional models that are made available by OpenAI you need to configure additional OpenAI deployed models in your Mendix app. For Azure OpenAI no deployed models are created by default. The technical model names depend on the deployment names that were chosen while deploying the models in the [Azure Portal](https://oai.azure.com/resource/deployments). Therefore in this case you always need to configure the deployed models manually in your Mendix app.
@@ -201,6 +220,16 @@ Mendix also strongly advises that you build user confirmation logic into functio
 For more information, see [Function Calling](/appstore/modules/genai/function-calling/). 
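The function-calling flow referenced above follows a general loop: the model returns a tool call, the app executes the corresponding logic (in Mendix, a microflow), and the result is fed back to the model until it produces a final answer. A minimal, self-contained sketch of that loop (illustrative Python with a stubbed model and a stubbed retrieval function; this is not the connector's actual implementation, and `stub_model`/`search_kb` are hypothetical stand-ins):

```python
# Generic tool-call loop, stubbed end to end so it runs without any API access.

def stub_model(messages):
    """Stand-in for a chat-completions call: first asks for a tool, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "search_kb", "arguments": {"query": "refund policy"}}}
    tool_result = next(m for m in messages if m["role"] == "tool")["content"]
    return {"content": f"Based on the knowledge base: {tool_result}"}


def search_kb(query):
    """Stand-in for a knowledge base retrieval (for example, an index query)."""
    return f"Refunds are accepted within 30 days (matched '{query}')."


TOOLS = {"search_kb": search_kb}


def chat_with_tools(user_message, model=stub_model, max_rounds=5):
    messages = [{"role": "user", "content": user_message}]
    for _ in range(max_rounds):
        reply = model(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]  # final assistant answer
        # Execute the requested tool and feed its result back to the model.
        result = TOOLS[call["name"]](**call["arguments"])
        messages.append({"role": "tool", "content": result})
    raise RuntimeError("model never produced a final answer")


answer = chat_with_tools("What is your refund policy?")
```

In the connector, this loop runs inside the Chat Completions operation, so app logic only registers the tools and receives the final response.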
+#### Index {#chatcompletions-index}
+
+Adding Azure indexes to a call enables LLMs to retrieve information when related topics are mentioned. Including these indexes in the request object, along with a name and description, enables the model to intelligently decide when to let the Mendix app call one or more predefined indexes. This allows the assistant to include the additional information in its response.
+
+OpenAI does not directly connect to the Azure AI Search resource. The model returns a tool call JSON structure that is used to build the input of the retrievals so that they can be executed as part of the chat completions operation. The OpenAI connector takes care of handling the tool call response as well as executing the function microflows until the API returns the assistant's final response.
+
+This functionality is part of the implementation executed by the GenAI Commons Chat Completions operations mentioned earlier. As a developer, you need to make the system aware of your indexes and their purpose by registering them with the request. This is done using the GenAI Commons operation [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request), which must be called once per index before passing the request to the Chat Completions operation.
+
+Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling.
+
 #### Vision {#chatcompletions-vision}
 
 Vision enables models like GPT-4o and GPT-4 Turbo to interpret and analyze images, allowing them to answer questions and perform tasks related to visual content. This integration of computer vision and language processing enhances the model's comprehension and makes it valuable for tasks involving visual information. 
To make use of vision inside the OpenAI connector, an optional [FileCollection](/appstore/modules/genai/genai-for-mx/commons/#filecollection) containing one or multiple images must be sent along with a single message.
@@ -305,6 +334,10 @@ All [tool choice types](/appstore/modules/genai/genai-for-mx/commons/#enum-toolc
 | none | none |
 | tool | tool |
 
+### Knowledge Base Retrieval
+
+When adding a [KnowledgeBaseRetrieval](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request) object to your request, you can set several optional parameters. Currently, only the `MaxNumberOfResults` parameter is applied to the search call; the others (`MinimumSimilarity` and `MetadataCollection`) are not supported by the OpenAI Connector.
+
 ## GenAI showcase Application {#showcase-application}
 
 For more inspiration or guidance on how to use those microflows in your logic, Mendix recommends downloading the [GenAI Showcase App](https://marketplace.mendix.com/link/component/220475), which demonstrates a variety of example use cases.
diff --git a/content/en/docs/marketplace/genai/reference-guide/genai-commons.md b/content/en/docs/marketplace/genai/reference-guide/genai-commons.md
index e5e87c74802..d9e42f73df6 100644
--- a/content/en/docs/marketplace/genai/reference-guide/genai-commons.md
+++ b/content/en/docs/marketplace/genai/reference-guide/genai-commons.md
@@ -544,7 +544,7 @@ This tool adds a function that performs a retrieval from a knowledge base to a [
 | Name | Type | Notes | Description |
 |---|---|---|---|
 | `Request` | [Request](#request) | mandatory | The request to which the knowledge base should be added. |
-| `Name` | String | mandatory | The name of the knowledge base to use or call. |
+| `Name` | String | mandatory | The name of the knowledge base to use or call. Technically, this is the name of the tool that is passed to the LLM. It must be unique per request (if multiple tools/knowledge base retrievals are added). 
|
 | `Description` | String | optional | A description of the knowledge base's purpose, used by the model to determine when and how to invoke it. |
 | `DeployedKnowledgeBase` | Object | mandatory | The knowledge base that is called within this tool. This object includes a `microflow`, which is executed when the knowledge base is invoked. |
 | `MaxNumberOfResults` | Integer | optional | This can be used to limit the number of results that should be retrieved. |

From 207185566be2f6264b8345b5a8c1b55e71e4071f Mon Sep 17 00:00:00 2001
From: Karuna-Mendix
Date: Tue, 8 Jul 2025 10:31:30 +0530
Subject: [PATCH 2/2] fix the broken image

---
 content/en/docs/marketplace/genai/how-to/byo_connector.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/content/en/docs/marketplace/genai/how-to/byo_connector.md b/content/en/docs/marketplace/genai/how-to/byo_connector.md
index 0062f216593..6949df3ac22 100644
--- a/content/en/docs/marketplace/genai/how-to/byo_connector.md
+++ b/content/en/docs/marketplace/genai/how-to/byo_connector.md
@@ -12,7 +12,7 @@ If you want to create your own connection to the LLM model of your choice while
 Building your own GenAI Commons connector offers several practical benefits that streamline development and enhance flexibility. You can reuse [ConversationalUI](/appstore/modules/genai/genai-for-mx/conversational-ui/) components, quickly set up with [starter apps](/appstore/modules/genai/how-to/starter-template/), and switch providers effortlessly.
 This guide will help you integrate your preferred LLM while maintaining a seamless and user-friendly chat experience.
 
-{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/connectors_diagram.svg" >}}
+{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/connectors_diagram.jpg" >}}
 
 ### Prerequisites