Add customised prompts for chat view
taichimaeda committed Apr 21, 2024
1 parent c66aa52 commit 6b0e46b
Showing 46 changed files with 179 additions and 59 deletions.
8 changes: 5 additions & 3 deletions src/api/clients/openai-compatible.ts
@@ -26,8 +26,10 @@ export abstract class OpenAICompatibleAPIClient implements APIClient {

const { settings } = this.plugin;
try {
const prompt = this.generator.generateChatPrompt(messages);
console.log('prompt', prompt);
const stream = await this.openai.chat.completions.create({
messages,
messages: prompt,
model: settings.chat.model,
max_tokens: settings.chat.maxTokens,
temperature: settings.chat.temperature,
@@ -72,9 +74,9 @@ export abstract class OpenAICompatibleAPIClient implements APIClient {

const { settings } = this.plugin;
try {
const messages = this.generator.generatePrompt(prefix, suffix);
const prompt = this.generator.generateCompletionsPrompt(prefix, suffix);
const completions = await this.openai.chat.completions.create({
messages,
messages: prompt,
model: settings.completions.model,
max_tokens: settings.completions.maxTokens,
temperature: settings.completions.temperature,
19 changes: 19 additions & 0 deletions src/api/prompts/chat/example1/assistant.md
@@ -0,0 +1,19 @@
Sure! Here is a Python implementation of Kadane's algorithm:

```python
def kadane_algorithm(arr):
    max_sum = float('-inf')
    current_sum = 0

    for num in arr:
        current_sum = max(num, current_sum + num)
        max_sum = max(max_sum, current_sum)

    return max_sum

# Example usage
arr = [-2, 1, -3, 4, -1, 2, 1, -5, 4]
print(kadane_algorithm(arr))  # Output: 6
```

This implementation finds the maximum sum of a contiguous subarray within the given array `arr` using Kadane's algorithm.
1 change: 1 addition & 0 deletions src/api/prompts/chat/example1/user.md
@@ -0,0 +1 @@
Show me a Python implementation of Kadane's algorithm.
16 changes: 16 additions & 0 deletions src/api/prompts/chat/example2/assistant.md
@@ -0,0 +1,16 @@
The Navier-Stokes equation is given by:

$$
\rho \left( \frac{\partial \textbf{u}}{\partial t} + \textbf{u} \cdot \nabla \textbf{u} \right) = - \nabla p + \mu \nabla^2 \textbf{u} + \textbf{f}
$$

where:

- $\rho$ is the fluid density,
- $\textbf{u}$ is the fluid velocity vector,
- $t$ is time,
- $p$ is the pressure,
- $\mu$ is the dynamic viscosity,
- $\nabla$ is the gradient operator,
- $\nabla^2$ is the Laplacian operator, and
- $\textbf{f}$ is the external force per unit volume.
1 change: 1 addition & 0 deletions src/api/prompts/chat/example2/user.md
@@ -0,0 +1 @@
Show me the Navier-Stokes equation using LaTeX.
20 changes: 20 additions & 0 deletions src/api/prompts/chat/index.ts
@@ -0,0 +1,20 @@
import { FewShotPrompt } from '..';
import example1Assistant from './example1/assistant.md';
import example1User from './example1/user.md';
import example2Assistant from './example2/assistant.md';
import example2User from './example2/user.md';
import system from './system.txt';

export const CHAT_PROMPT: FewShotPrompt = {
  system,
  examples: [
    {
      user: example1User,
      assistant: example1Assistant,
    },
    {
      user: example2User,
      assistant: example2Assistant,
    },
  ],
};
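The file above assembles `CHAT_PROMPT` from a system prompt plus user/assistant example pairs. The `FewShotPrompt` interface itself is not shown in this commit; assuming the shape implied by its uses here, the following is a minimal sketch of how such a prompt expands into the flat message list sent to the API (the interface and function names are assumptions for illustration, not the plugin's actual definitions):

```typescript
// Hypothetical shapes: the real FewShotPrompt and ChatMessage types live
// elsewhere in the repo (src/api/prompts and src/api) and may differ.
interface FewShotPrompt {
  system: string;
  examples: { user: string; assistant: string }[];
}

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Expand a few-shot prompt into a flat message list: system prompt first,
// then each example as a user/assistant pair, then the live conversation.
function expandPrompt(
  prompt: FewShotPrompt,
  messages: ChatMessage[],
): ChatMessage[] {
  return [
    { role: 'system', content: prompt.system },
    ...prompt.examples.flatMap((example): ChatMessage[] => [
      { role: 'user', content: example.user },
      { role: 'assistant', content: example.assistant },
    ]),
    ...messages,
  ];
}
```

This mirrors the ordering that `generateChatPrompt` in `src/api/prompts/generator.ts` produces below: the model sees the examples as prior turns before the user's real messages.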
4 changes: 4 additions & 0 deletions src/api/prompts/chat/system.txt
@@ -0,0 +1,4 @@
Answer the user's question.
Code blocks must be formatted using triple backticks (```), and the language name must be specified.
Math blocks must be formatted using double dollar signs ($$).
Inline math must be formatted using single dollar signs ($).
@@ -1,9 +1,9 @@
import { FewShotPrompt } from '..';
import { FewShotPrompt } from '../..';
import example1Assistant from './example1/assistant.txt';
import example1User from './example1/user.md';
import system from './system.txt';

export const ListItemPrompt: FewShotPrompt = {
export const BLOCK_QUOTE_PROMPT: FewShotPrompt = {
system,
examples: [
{
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
@@ -1,4 +1,4 @@
import { FewShotPrompt } from '..';
import { FewShotPrompt } from '../..';
import example1Assistant from './example1/assistant.txt';
import example1User from './example1/user.md';
import example2Assistant from './example2/assistant.txt';
Expand All @@ -7,7 +7,7 @@ import example3Assistant from './example3/assistant.txt';
import example3User from './example3/user.md';
import system from './system.txt';

export const CodeBlockPrompt: FewShotPrompt = {
export const CODE_BLOCK_PROMPT: FewShotPrompt = {
system,
examples: [
{
File renamed without changes.
File renamed without changes.
File renamed without changes.
@@ -1,11 +1,11 @@
import { FewShotPrompt } from '..';
import { FewShotPrompt } from '../..';
import example1Assistant from './example1/assistant.txt';
import example1User from './example1/user.md';
import example2Assistant from './example2/assistant.txt';
import example2User from './example2/user.md';
import system from './system.txt';

export const MathBlockPrompt: FewShotPrompt = {
export const HEADING_PROMPT: FewShotPrompt = {
system,
examples: [
{
File renamed without changes.
File renamed without changes.
@@ -1,9 +1,9 @@
import { FewShotPrompt } from '..';
import { FewShotPrompt } from '../..';
import example1Assistant from './example1/assistant.txt';
import example1User from './example1/user.md';
import system from './system.txt';

export const BlockQuotePrompt: FewShotPrompt = {
export const LIST_ITEM_PROMPT: FewShotPrompt = {
system,
examples: [
{
File renamed without changes.
File renamed without changes.
File renamed without changes.
@@ -1,11 +1,11 @@
import { FewShotPrompt } from '..';
import { FewShotPrompt } from '../..';
import example1Assistant from './example1/assistant.txt';
import example1User from './example1/user.md';
import example2Assistant from './example2/assistant.txt';
import example2User from './example2/user.md';
import system from './system.txt';

export const HeadingPrompt: FewShotPrompt = {
export const MATH_BLOCK_PROMPT: FewShotPrompt = {
system,
examples: [
{
File renamed without changes.
File renamed without changes.
@@ -1,9 +1,9 @@
import { FewShotPrompt } from '..';
import { FewShotPrompt } from '../..';
import example1Assistant from './example1/assistant.txt';
import example1User from './example1/user.md';
import system from './system.txt';

export const ParagraphPrompt: FewShotPrompt = {
export const PARAGRAPH_PROMPT: FewShotPrompt = {
system,
examples: [
{
File renamed without changes.
118 changes: 75 additions & 43 deletions src/api/prompts/generator.ts
@@ -1,40 +1,79 @@
import Markpilot from 'src/main';
import { FewShotPrompt } from '.';
import { ChatMessage } from '..';
import { BlockQuotePrompt } from './block-quote';
import { CodeBlockPrompt } from './code-block';
import { CHAT_PROMPT } from './chat';
import { BLOCK_QUOTE_PROMPT } from './completions/block-quote';
import { CODE_BLOCK_PROMPT } from './completions/code-block';
import { HEADING_PROMPT } from './completions/heading';
import { LIST_ITEM_PROMPT } from './completions/list-item';
import { MATH_BLOCK_PROMPT } from './completions/math-block';
import { PARAGRAPH_PROMPT } from './completions/paragraph';
import { Context, getContext, getLanguage } from './context';
import { HeadingPrompt } from './heading';
import { ListItemPrompt } from './list-item';
import { MathBlockPrompt } from './math-block';
import { ParagraphPrompt } from './paragraph';

const PROMPTS: Record<Context, FewShotPrompt> = {
heading: HeadingPrompt,
paragraph: ParagraphPrompt,
'list-item': ListItemPrompt,
'block-quote': BlockQuotePrompt,
'math-block': MathBlockPrompt,
'code-block': CodeBlockPrompt,
const COMPLETIONS_PROMPTS: Record<Context, FewShotPrompt> = {
heading: HEADING_PROMPT,
paragraph: PARAGRAPH_PROMPT,
'list-item': LIST_ITEM_PROMPT,
'block-quote': BLOCK_QUOTE_PROMPT,
'math-block': MATH_BLOCK_PROMPT,
'code-block': CODE_BLOCK_PROMPT,
};

export class PromptGenerator {
constructor(private plugin: Markpilot) {}

parseResponse(content: string) {
const lines = content.split('\n');
return lines.slice(lines.indexOf('<INSERT>') + 1).join('\n');
generateChatPrompt(messages: ChatMessage[]) {
const prompt = CHAT_PROMPT;
const system = prompt.system;

return [
{
role: 'system',
content: system,
},
...this.makeChatExamples(),
...messages,
] as ChatMessage[];
}

makeExamples(prefix: string, suffix: string) {
generateCompletionsPrompt(prefix: string, suffix: string) {
const { settings } = this.plugin;

if (!settings.completions.fewShot) {
const context = getContext(prefix, suffix);
const prompt = COMPLETIONS_PROMPTS[context];
const system =
context === 'code-block'
? prompt.system.replace('{{LANGUAGE}}', getLanguage(prefix, suffix)!)
: prompt.system;

const windowSize = settings.completions.windowSize;
const truncatedPrefix = prefix.slice(
prefix.length - windowSize / 2,
prefix.length,
);
const truncatedSuffix = suffix.slice(0, windowSize / 2);

return [
{
role: 'system',
content: system,
},
...this.makeCompletionsExamples(prefix, suffix),
{
role: 'user',
content: `${truncatedPrefix}<MASK>${truncatedSuffix}`,
},
] as ChatMessage[];
}

makeChatExamples() {
const { settings } = this.plugin;

if (!settings.chat.fewShot) {
return [];
}

const context = getContext(prefix, suffix);
const prompt = PROMPTS[context];
const prompt = CHAT_PROMPT;
return prompt.examples.flatMap((example) => [
{
role: 'user',
Expand All @@ -47,36 +86,29 @@ export class PromptGenerator {
]);
}

makeRequest(prefix: string, suffix: string): string {
makeCompletionsExamples(prefix: string, suffix: string) {
const { settings } = this.plugin;

const windowSize = settings.completions.windowSize;
const truncatedPrefix = prefix.slice(
prefix.length - windowSize / 2,
prefix.length,
);
const truncatedSuffix = suffix.slice(0, windowSize / 2);
return `${truncatedPrefix}<MASK>${truncatedSuffix}`;
}
if (!settings.completions.fewShot) {
return [];
}

generatePrompt(prefix: string, suffix: string) {
const context = getContext(prefix, suffix);
const prompt = PROMPTS[context];
const system =
context === 'code-block'
? prompt.system.replace('{{LANGUAGE}}', getLanguage(prefix, suffix)!)
: prompt.system;

return [
const prompt = COMPLETIONS_PROMPTS[context];
return prompt.examples.flatMap((example) => [
{
role: 'system',
content: system,
role: 'user',
content: example.user,
},
...this.makeExamples(prefix, suffix),
{
role: 'user',
content: this.makeRequest(prefix, suffix),
role: 'assistant',
content: example.assistant,
},
] as ChatMessage[];
]);
}

parseResponse(content: string) {
const lines = content.split('\n');
return lines.slice(lines.indexOf('<INSERT>') + 1).join('\n');
}
}
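`generateCompletionsPrompt` above truncates the document around the cursor to at most `windowSize` characters before inserting a `<MASK>` marker. A standalone sketch of that truncation (the function name is hypothetical; in the commit this logic is inlined in the method):

```typescript
// Keep the last windowSize / 2 characters of the prefix and the first
// windowSize / 2 characters of the suffix, joined by a <MASK> marker
// standing in for the text the model should infill.
function makeMaskedRequest(
  prefix: string,
  suffix: string,
  windowSize: number,
): string {
  const truncatedPrefix = prefix.slice(
    Math.max(0, prefix.length - windowSize / 2),
  );
  const truncatedSuffix = suffix.slice(0, windowSize / 2);
  return `${truncatedPrefix}<MASK>${truncatedSuffix}`;
}
```

For inputs shorter than half the window, `slice` simply returns the whole string, so small documents pass through untruncated.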
4 changes: 3 additions & 1 deletion src/chat/App.tsx
@@ -59,7 +59,9 @@ export function App({
useEffect(() => {
if (turn === 'assistant') {
(async () => {
for await (const chunk of fetcher(history.messages)) {
// Ignores the first message which is the system prompt.
const messages = history.messages.slice(1);
for await (const chunk of fetcher(messages)) {
setHistory((history) => ({
...history,
response: history.response + chunk,
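The `slice(1)` added above drops the leading system message stored at the head of `history.messages`, so the system prompt is supplied once by `generateChatPrompt` rather than echoed back from the chat view. A tiny sketch of the idea (the types are assumed stand-ins for the plugin's own):

```typescript
// Assumed message shape for illustration.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Drop the stored system prompt before handing the conversation to the
// fetcher; generateChatPrompt will prepend its own system message.
function withoutSystemPrompt(history: ChatMessage[]): ChatMessage[] {
  return history.slice(1);
}
```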
18 changes: 18 additions & 0 deletions src/settings/index.ts
@@ -44,6 +44,7 @@ export interface MarkpilotSettings {
enabled: boolean;
provider: Provider;
model: Model;
fewShot: boolean;
maxTokens: number;
temperature: number;
history: ChatHistory;
@@ -91,6 +92,8 @@ export const DEFAULT_SETTINGS: MarkpilotSettings = {
enabled: true,
provider: DEFAULT_PROVIDER,
model: DEFAULT_MODELS[DEFAULT_PROVIDER],
// Few-shot prompts are still in beta
fewShot: false,
maxTokens: 1024,
temperature: 0.5,
history: {
@@ -480,6 +483,21 @@ export class MarkpilotSettingTab extends PluginSettingTab {
});
});

new Setting(containerEl)
.setName('Few-shot prompts (Beta)')
.setDesc(
'Turn this on to enable few-shot prompts for chat view. This is a beta feature and may not work as expected.',
)
.addToggle((toggle) =>
toggle
.setDisabled(!settings.chat.enabled)
.setValue(settings.chat.fewShot)
.onChange(async (value) => {
settings.chat.fewShot = value;
await plugin.saveSettings();
}),
);

new Setting(containerEl)
.setName('Max tokens')
.setDesc('Set the max tokens for chat view.')
1 change: 1 addition & 0 deletions src/settings/migrators/1.1.0-1.2.0.test.ts
@@ -75,6 +75,7 @@ const version1_2_0: MarkpilotSettings1_2_0 = {
enabled: true,
provider: 'openai',
model: 'gpt-4',
fewShot: false,
maxTokens: 10,
temperature: 1,
history: {
2 changes: 2 additions & 0 deletions src/settings/migrators/1.1.0-1.2.0.ts
@@ -39,6 +39,8 @@ export const migrateVersion1_1_0_toVersion1_2_0: SettingsMigrator<
...settings.chat,
provider: 'openai',
model: 'gpt-3.5-turbo',
// Few-shot prompts are still in beta
fewShot: false,
},
cache: {
enabled: true, // Enable cache by default.
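The migrator change above carries the 1.1.0 chat settings forward and defaults the new `fewShot` flag to `false`. A reduced sketch of that pattern (these interfaces are simplified stand-ins for the real versioned settings types):

```typescript
// Simplified stand-ins for the plugin's versioned settings interfaces.
interface ChatSettings1_1_0 {
  maxTokens: number;
  temperature: number;
}

interface ChatSettings1_2_0 extends ChatSettings1_1_0 {
  fewShot: boolean;
}

// Spread the old settings and default the new field, mirroring the
// migrator's `...settings.chat` followed by explicit new keys.
function migrateChatSettings(chat: ChatSettings1_1_0): ChatSettings1_2_0 {
  return {
    ...chat,
    fewShot: false, // Few-shot prompts are still in beta.
  };
}
```

Defaulting new flags in the migrator keeps older vaults loading cleanly while leaving the beta feature opt-in.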
1 change: 1 addition & 0 deletions src/settings/versions/1.2.0/index.ts
@@ -36,6 +36,7 @@ export interface MarkpilotSettings1_2_0 {
enabled: boolean;
provider: Provider;
model: Model;
fewShot: boolean;
maxTokens: number;
temperature: number;
history: ChatHistory;