Releases: ngxson/llama.cpp

b3202

22 Jun 18:57
3e58b0e
cvector: fix CI + correct help message (#8064)

* cvector: fix CI + correct help message

* also correct --pca-iter

b3201

22 Jun 16:21
adf480c
cvector-generator: Moe Moe Fixie-Fixie for Lots of Formats~! ♡(ᐢ ᴥ ᐢ)…

b3183

19 Jun 12:29
a04a953
codecov : remove (#8004)

b3040

29 May 22:53
55d6226
metal : remove invalid asserts (#7617)

b2986

23 May 21:11
74f33ad
readme : remove trailing space (#7469)

b2879

14 May 16:11
4f02636
server: free sampling contexts on exit (#7264)

* server: free sampling contexts on exit

This cleans up last leak found by the address sanitizer.

* fix whitespace

* fix whitespace

b2821

08 May 20:25
c12452c
JSON: [key] -> .at(key), assert() -> GGML_ASSERT (#7143)

b2809

08 May 09:53
acdce3c
compare-llama-bench.py: add missing basicConfig (#7138)

* compare-llama-bench.py: add missing basicConfig

* compare-llama-bench.py: Add line break between error message and print_help()

* Add regular print() markdown table

b2786

04 May 13:09
03fb8a0
If first token generated from the server is the stop word the server …

b2724

24 Apr 17:06
b4e4b8a
llama : add llama_get_pooling_type function (#6862)

* add llama_get_pooling_type function

* fix argument name, move with ctx funcs