server : revamp chat UI with vuejs and daisyui (ggml-org#10175)
* server : simple chat UI with vuejs and daisyui
* move old files to legacy folder
* embed deps into binary
* basic markdown support
* add conversation history, save to localStorage
* fix bg-base classes
* save theme preferences
* fix tests
* regenerate, edit, copy buttons
* small fixes
* docs: how to use legacy ui
* better error handling
* make CORS preflight more explicit
* add GET method for CORS
* fix tests
* clean up a bit
* better auto scroll
* small fixes
* use collapse-arrow
* fix closeAndSaveConfigDialog
* small fix
* remove console.log
* fix style for <pre> element
* lighter bubble color (less distracting when reading)
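The "add conversation history, save to localStorage" step above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: the function names `saveConversation`/`loadConversation` and the `"conversations"` storage key are hypothetical.

```javascript
// Sketch of persisting chat history in a localStorage-like key/value store.
// Falls back to an in-memory shim when localStorage is unavailable (e.g. Node).
// All names here are hypothetical illustrations, not the PR's implementation.
const storage = (typeof localStorage !== 'undefined')
  ? localStorage
  : {
      _m: new Map(),
      getItem(k) { return this._m.has(k) ? this._m.get(k) : null; },
      setItem(k, v) { this._m.set(k, String(v)); },
    };

// Store one conversation (an array of {role, content} messages) under an id.
function saveConversation(id, messages) {
  const all = JSON.parse(storage.getItem('conversations') ?? '{}');
  all[id] = messages;
  storage.setItem('conversations', JSON.stringify(all));
}

// Load a conversation by id; returns an empty history if none was saved.
function loadConversation(id) {
  const all = JSON.parse(storage.getItem('conversations') ?? '{}');
  return all[id] ?? [];
}
```

Serializing all conversations under a single key keeps the storage layout simple; a per-conversation key would avoid rewriting the whole blob on each save.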
examples/server/README.md (+10 lines)
@@ -928,6 +928,16 @@ Apart from error types supported by OAI, we also have custom types that are spec
 }
 ```
+
+### Legacy completion web UI
+
+A new chat-based UI has replaced the old completion-based one since [this PR](https://github.com/ggerganov/llama.cpp/pull/10175). If you want to use the old completion UI, start the server with `--path ./examples/server/public_legacy`.
+
 ### Extending or building alternative Web Front End
 
 You can extend the front end by running the server binary with `--path` set to `./your-directory` and importing `/completion.js` to get access to the llamaComplete() method.
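Serving the legacy UI described in the diff can be sketched as a single server invocation. This is a sketch, not a verified command line: the binary name `llama-server` and the model path are assumptions to adapt to your build.

```shell
# Sketch: serve the legacy completion UI instead of the new chat UI.
# "llama-server" and "models/your-model.gguf" are placeholders for your setup.
./llama-server -m models/your-model.gguf --path ./examples/server/public_legacy
```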