
Releases: OuadiElfarouki/llama.cpp

b2966

22 May 14:40
fcda112
vulkan: add workaround for iterator boundary check to fix clang-cl de…

b2953

21 May 00:49
917dc8c
Tokenizer SPM fixes for phi-3 and llama-spm (#7375)

* Update brute force test: special tokens
* Fix added tokens
  - Try to read 'added_tokens.json'.
  - Try to read 'tokenizer_config.json'.
  - Try to read 'tokenizer.json'.
* Fix special tokens rtrim

Co-authored-by: Georgi Gerganov <[email protected]>
* server : fix test regexes
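
The fix above reads added/special tokens from 'added_tokens.json', falling back to 'tokenizer_config.json' and then 'tokenizer.json'. Below is a rough sketch of that fallback order, mirroring the usual Hugging Face file layouts; the actual change lives in the model conversion tooling, so the code and its use of nlohmann/json are illustrative only.

```cpp
// added_tokens_sketch.cpp -- illustrative only; the real fix lives in the
// conversion tooling, and nlohmann/json is assumed here just for parsing.
#include <nlohmann/json.hpp>
#include <fstream>
#include <map>
#include <string>

using json = nlohmann::json;

// Collect added/special tokens, preferring the first file that provides them.
static std::map<std::string, int> read_added_tokens(const std::string & dir) {
    std::map<std::string, int> tokens;

    // 1. added_tokens.json: a plain { "token": id } map.
    if (std::ifstream f{dir + "/added_tokens.json"}) {
        const json j = json::parse(f);
        for (const auto & [tok, id] : j.items()) {
            tokens[tok] = id.get<int>();
        }
        if (!tokens.empty()) return tokens;
    }

    // 2. tokenizer_config.json: "added_tokens_decoder" maps id -> { "content": ... }.
    if (std::ifstream f{dir + "/tokenizer_config.json"}) {
        const json j   = json::parse(f);
        const json dec = j.value("added_tokens_decoder", json::object());
        for (const auto & [id, info] : dec.items()) {
            tokens[info.at("content").get<std::string>()] = std::stoi(id);
        }
        if (!tokens.empty()) return tokens;
    }

    // 3. tokenizer.json: "added_tokens" is a list of { "id": ..., "content": ... }.
    if (std::ifstream f{dir + "/tokenizer.json"}) {
        const json j     = json::parse(f);
        const json added = j.value("added_tokens", json::array());
        for (const auto & t : added) {
            tokens[t.at("content").get<std::string>()] = t.at("id").get<int>();
        }
    }
    return tokens;
}

int main() {
    return read_added_tokens(".").empty() ? 1 : 0;
}
```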

b2843

10 May 17:00
e849648
llama-bench : add pp+tg test type (#7199)

b2825

09 May 09:19
07cd41d
TypoFix (#7162)

b2817

08 May 17:05
83330d8
main : add --conversation / -cnv flag (#7108)

b2806

08 May 05:43
c780e75
Further tidy on Android instructions README.md (#7077)

* Further tidy on Android instructions README.md

Fixed some logic when following the readme directions

* Clean up redundant information

A new user arriving will see simple directions on the llama.cpp homepage

* corrected punctuation

Period after cmake, colon after termux

* re-word for clarity

"method" seems more correct than "alternative" in this context

* Organized required packages per build type

Building llama.cpp with the NDK on a PC doesn't require installing clang, cmake, git, or wget in Termux.

* README.md

corrected title

* fix trailing whitespace

b2751

28 Apr 06:06
4dba7e8
Replace "alternative" boolean operator in conditional compilation dir…
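
The title above is truncated. As a hedged illustration of the pattern such a change typically targets, swapping the alternative tokens "and"/"or" for "&&"/"||" inside a preprocessor conditional, here is a sketch; the macro names are placeholders, not the actual directive from that commit.

```cpp
// alt_token_sketch.cpp -- illustrative only; the macro names are placeholders.
// The alternative spellings "and"/"or" are valid ISO C++ tokens, but some
// compilers (e.g. MSVC in its default mode) fail to parse them inside
// preprocessor conditionals:
//
//   #if defined(GGML_USE_CUDA) or defined(GGML_USE_VULKAN)
//
// The portable form uses the standard operator token instead:
#if defined(GGML_USE_CUDA) || defined(GGML_USE_VULKAN)
#    define EXAMPLE_GPU_BACKEND 1
#else
#    define EXAMPLE_GPU_BACKEND 0
#endif

int main() { return EXAMPLE_GPU_BACKEND; }
```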

b2634

08 Apr 20:29
cc4a954
llama : fix attention layer count sanity check (#6550)

* llama : fix attention layer count sanity check

* llama : fix parentheses in attention layer count sanity check

There was otherwise a warning when compiling.

---------

Co-authored-by: Francis Couture-Harpin <[email protected]>
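
For context on the parentheses fix: mixing && and || in one condition without explicit grouping still compiles, but GCC and Clang emit a -Wparentheses warning. A minimal sketch of the pattern, with hypothetical struct and field names rather than the code from the commit:

```cpp
// sanity_check_sketch.cpp -- hypothetical illustration; the names below are
// not taken from the actual commit.
#include <cassert>
#include <cstdint>

struct hparams_t {
    uint32_t n_layer   = 32;
    uint32_t n_head    = 32;
    uint32_t n_head_kv = 8;
};

void check(const hparams_t & hp) {
    // Written without the inner parentheses, i.e.
    //   assert(hp.n_head_kv == 0 || hp.n_head % hp.n_head_kv == 0 && hp.n_head_kv <= hp.n_head);
    // the condition still compiles (&& binds tighter than ||), but GCC/Clang
    // warn under -Wparentheses because the grouping is easy to misread.
    assert(hp.n_head_kv == 0 ||
           (hp.n_head % hp.n_head_kv == 0 && hp.n_head_kv <= hp.n_head));
}

int main() {
    check(hparams_t{});
    return 0;
}
```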

b2624

07 Apr 16:34
c372477
sync : ggml

b2589

03 Apr 16:24
1ff4d9f
Add OpenChat, Alpaca, Vicuna chat templates (#6397)

* Add openchat chat template

* Add chat template test for openchat

* Add chat template for vicuna

* Add chat template for orca-vicuna

* Add EOS for vicuna templates

* Combine vicuna chat templates

* Add tests for openchat and vicuna chat templates

* Add chat template for alpaca

* Add separate template name for vicuna-orca

* Remove alpaca, match deepseek with jinja output

* Regenerate chat template test with add_generation_prompt

* Separate deepseek bos from system message

* Match openchat template with jinja output

* Remove BOS token from templates, unprefix openchat
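
A minimal sketch of exercising one of these templates through llama_chat_apply_template(); whether the short name "vicuna" is accepted directly, as opposed to passing the model's full Jinja template string, depends on the llama.cpp version, so that argument is an assumption.

```cpp
// chat_template_sketch.cpp -- minimal usage sketch, not code from the PR.
#include "llama.h"
#include <cstdio>
#include <vector>

int main() {
    // Two turns of a conversation in the generic role/content form.
    const llama_chat_message msgs[] = {
        { "system", "You are a helpful assistant." },
        { "user",   "Hello!"                       },
    };

    std::vector<char> buf(1024);
    // With model == nullptr the template passed in `tmpl` is used; accepting
    // the short name "vicuna" is an assumption about this version's matching.
    const int32_t n = llama_chat_apply_template(
        /*model   =*/ nullptr,
        /*tmpl    =*/ "vicuna",
        msgs, 2,
        /*add_ass =*/ true,  // append the assistant prefix so generation can start
        buf.data(), (int32_t) buf.size());

    if (n > 0 && n < (int32_t) buf.size()) {
        printf("%.*s\n", n, buf.data());
    }
    return 0;
}
```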