Releases: ngxson/llama.cpp
b3202
cvector: fix CI + correct help message (#8064)
* also correct --pca-iter
b3201
cvector-generator: Moe Moe Fixie-Fixie for Lots of Formats~! ♡(ᐢ ᴥ ᐢ)…
b3183
codecov : remove (#8004)
b3040
metal : remove invalid asserts (#7617)
b2986
readme : remove trailing space (#7469)
b2879
server: free sampling contexts on exit (#7264)
* This cleans up the last leak found by the address sanitizer.
* fix whitespace
b2821
JSON: [key] -> .at(key), assert() -> GGML_ASSERT (#7143)
b2809
compare-llama-bench.py: add missing basicConfig (#7138)
* Add line break between error message and print_help()
* Add regular print() markdown table
b2786
If first token generated from the server is the stop word the server …
b2724
llama : add llama_get_pooling_type function (#6862)
* fix argument name, move with ctx funcs