feat: system prompt
LGTM
scenaristeur authored Jan 18, 2024
1 parent a9d2e3c commit 3b0d750
Showing 8 changed files with 55 additions and 4 deletions.
2 changes: 2 additions & 0 deletions docs/configuration.md
@@ -20,6 +20,8 @@ You can config the model by the following steps:

[LLamaChatPromptOptions](https://withcatai.github.io/node-llama-cpp/api/type-aliases/LLamaChatPromptOptions)

You can edit the [systemPrompt](system_prompt.md) of the chat too.


3. Restart the server.
![Restart Button](./configuration/restart-button.png)
15 changes: 15 additions & 0 deletions docs/system_prompt.md
@@ -0,0 +1,15 @@
# CatAI systemPrompt

According to [LlamaChatSessionOptions](https://withcatai.github.io/node-llama-cpp/api/type-aliases/LlamaChatSessionOptions), it is possible to modify the `systemPrompt` of a chat.

This can be achieved by adding a `systemPrompt` key in `modelSettings`:

![catAi systemPrompt settings](system_prompt/settings.png)


Save and restart to apply.

The chat then acts like a pirate, according to the `systemPrompt` you chose ;-)

![catAi systemPrompt demo](system_prompt/demo.png)
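For reference, here is a minimal sketch of what the model's `settings` block might look like once a `systemPrompt` is added (the pirate prompt is illustrative, and the surrounding keys are assumptions based on the existing `models.json` entries):

```json
{
  "settings": {
    "bind": "node-llama-cpp-v2",
    "systemPrompt": "You are a pirate. Answer every question like a pirate would."
  }
}
```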
Binary file added docs/system_prompt/demo.png
Binary file added docs/system_prompt/settings.png
24 changes: 23 additions & 1 deletion models.json
@@ -441,5 +441,27 @@
"contextSize": 4096
},
"version": 1
},
"dolphin-2.2.1-mistral-7b-q4_0": {
"download": {
"files": {
"model": "dolphin-2.2.1-mistral-7b.Q4_0.gguf"
},
"repo": "https://huggingface.co/TheBloke/dolphin-2.2.1-mistral-7B-GGUF",
"commit": "34548d6105f2ffdc35b066364a954d42c24a2d87",
"branch": "main"
},
"hardwareCompatibility": {
"ramGB": 4.4,
"cpuCors": 2,
"compressions": "q4_0"
},
"compatibleCatAIVersionRange": [
"3.0.2"
],
"settings": {
"bind": "node-llama-cpp-v2"
},
"version": 1.1
}
}
}
6 changes: 6 additions & 0 deletions package-lock.json


@@ -1,7 +1,7 @@
import {ChatContext} from '../../../chat-context.js';
import createChatWrapper from './chat-wrapper/chat-wrapper.js';
import type NodeLlamaCpp from './node-llama-cpp-v2.js';
import type {LlamaChatSession, LlamaModel} from 'node-llama-cpp';
import createChatWrapper from './chat-wrapper/chat-wrapper.js';
import {ChatContext} from '../../../chat-context.js';

export default class NodeLlamaCppChat extends ChatContext {
private _session: LlamaChatSession;
@@ -11,6 +11,9 @@ export default class NodeLlamaCppChat extends ChatContext {
this._session = new _package.LlamaChatSession({
context: new _package.LlamaContext({model}),
promptWrapper: createChatWrapper(_package, _parent.modelSettings.settings?.wrapper),
systemPrompt: _parent.modelSettings.settings?.systemPrompt,
printLLamaSystemInfo: _parent.modelSettings.settings?.printLLamaSystemInfo,
conversationHistory: _parent.modelSettings.settings?.conversationHistory
});
}

@@ -1,10 +1,13 @@
import type {LlamaModel, LlamaModelOptions} from 'node-llama-cpp';
import type {ConversationInteraction, LlamaModel, LlamaModelOptions} from 'node-llama-cpp';
import NodeLlamaCppChat from './node-llama-cpp-chat.js';
import BaseBindClass from '../../base-bind-class.js';

type NodeLlamaCppOptions = Omit<LlamaModelOptions, 'modelPath'> & {
wrapper?: string,
maxTokens?: number,
printLLamaSystemInfo?: boolean,
systemPrompt?: string,
conversationHistory?: readonly ConversationInteraction[],
};

export default class NodeLlamaCppV2 extends BaseBindClass<NodeLlamaCppOptions> {
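A minimal TypeScript sketch of how the new options flow from `modelSettings` into the chat session. The local types mirror the fields added in this commit; the `sessionOptions` helper is hypothetical and only illustrates the forwarding done in `NodeLlamaCppChat`'s constructor (`ConversationInteraction` is stubbed so the sketch is self-contained):

```typescript
// Stub mirroring node-llama-cpp's ConversationInteraction shape (assumption).
type ConversationInteraction = {prompt: string, response: string};

// Mirrors the fields added to NodeLlamaCppOptions in this commit.
type NodeLlamaCppOptions = {
    wrapper?: string,
    maxTokens?: number,
    printLLamaSystemInfo?: boolean,
    systemPrompt?: string,
    conversationHistory?: readonly ConversationInteraction[],
};

const settings: NodeLlamaCppOptions = {
    systemPrompt: 'You are a pirate. Answer every question like a pirate would.',
    printLLamaSystemInfo: false
};

// Forwards the optional fields, much like the LlamaChatSession call above.
function sessionOptions(modelSettings: NodeLlamaCppOptions) {
    return {
        systemPrompt: modelSettings.systemPrompt,
        printLLamaSystemInfo: modelSettings.printLLamaSystemInfo,
        conversationHistory: modelSettings.conversationHistory
    };
}

console.log(sessionOptions(settings).systemPrompt);
```

Because every field is optional, models that predate this commit keep working unchanged: missing keys are simply forwarded as `undefined`.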
