A WildFly Chat Bot. This WildFly Bootable JAR application provides a web-based UI that lets you interact with your WildFly servers using natural language.
By default the file `./mcp.json` is read. You can configure a different file with `-Dorg.wildfly.ai.chatbot.mcp.config=<path to file>`.
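The configuration file lists the MCP servers the chat bot should connect to. The exact schema is not shown here; the following is a minimal sketch, assuming the common `mcpServers` layout and a hypothetical path to the server jar built in the next step:

```json
{
  "mcpServers": {
    "wildfly": {
      "command": "java",
      "args": ["-jar", "../wildfly-mcp-server/target/wildfly-mcp-server.jar"]
    }
  }
}
```

Both the key name and the jar path are assumptions; check the `mcp.json` shipped with the project for the actual layout.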
- Build the WildFly MCP server located in `../wildfly-mcp-server` (the chat bot uses it in its default `mcp.json` configuration).
- Build the WildFly chat bot:

mvn clean install
- Start the chat bot using local ollama (by default it uses the `qwen2.5:3b` model); once started it listens on port 8090:

java -jar target/wildfly-chat-bot-bootable.jar

This chat bot has also been tried with the `llama3.2:3b` model and provided good results.
- Start the chat bot using groq; once started it listens on port 8090:

GROQ_CHAT_MODEL_NAME=llama3-70b-8192 GROQ_API_KEY=<Your groq key> java -jar target/wildfly-chat-bot-bootable.jar -Dorg.wildfly.ai.chatbot.llm.name=groq
Ollama environment variables:

Env Variable | Description |
---|---|
OLLAMA_CHAT_URL | Ollama URL, default value http://127.0.0.1:11434 |
OLLAMA_CHAT_MODEL_NAME | Ollama model, default value qwen2.5:3b |
OLLAMA_CHAT_TEMPERATURE | Model temperature, default value 0.9 |
OLLAMA_CHAT_LOG_REQUEST | Log requests, default value true |
OLLAMA_CHAT_LOG_RESPONSE | Log responses, default value true |
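These variables can be set per invocation before launching the jar. A minimal sketch of a launcher that falls back to the documented defaults (the actual `java` command is commented out; only the variable handling is illustrated):

```shell
#!/bin/sh
# Resolve the Ollama settings, falling back to the documented defaults.
MODEL="${OLLAMA_CHAT_MODEL_NAME:-qwen2.5:3b}"
TEMPERATURE="${OLLAMA_CHAT_TEMPERATURE:-0.9}"
echo "model=$MODEL temperature=$TEMPERATURE"
# OLLAMA_CHAT_MODEL_NAME="$MODEL" OLLAMA_CHAT_TEMPERATURE="$TEMPERATURE" \
#   java -jar target/wildfly-chat-bot-bootable.jar
```

With no variables exported, this prints the defaults from the table above.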
Groq environment variables:

Env Variable | Description |
---|---|
GROQ_API_KEY | Your API key |
GROQ_CHAT_URL | URL, default value http://127.0.0.1:11434 |
GROQ_CHAT_MODEL_NAME | Model, default value qwen2.5:3b |
GROQ_CHAT_LOG_REQUEST | Log requests, default value true |
GROQ_CHAT_LOG_RESPONSE | Log responses, default value true |