This repository has been archived by the owner on Sep 12, 2024. It is now read-only.

v0.1.6

Pre-release
@hlhr202 released this on 29 May at 13:33
· 3 commits to main since this release

What's Changed

  • feat: implement chat feature to rwkv by @yorkzero831 in #63
  • feat: update cuda dynamic compiling by @hlhr202 in #66
  • feat: optimize llama.cpp loading, fix llama.cpp tokenizer, unify logger by @hlhr202 in #75
  • update: refactor onnx by @hlhr202 in #87
  • update: upgrade llm to 0.2.0-dev by @fardjad in #86

Full Changelog: v0.1.4...v0.1.6