This repository has been archived by the owner on Sep 12, 2024. It is now read-only.
v0.1.6
Pre-release
What's Changed
- feat: implement chat feature for rwkv by @yorkzero831 in #63
- feat: update cuda dynamic compiling by @hlhr202 in #66
- feature: optimize llama.cpp loading, fix llama.cpp tokenizer, unify logger by @hlhr202 in #75
- update: refactor onnx by @hlhr202 in #87
- update: upgrade llm to 0.2.0-dev by @fardjad in #86
New Contributors
Full Changelog: v0.1.4...v0.1.6