llama.cpp/examples

Latest commit: eiery 10f19c1121 "llama : have n_batch default to 512 (#1091)" (1 year ago)

* set default n_batch to 512 when using BLAS
* spacing
* alternate implementation of setting different n_batch for BLAS
* set n_batch to 512 for all cases
Name            Last commit                                                                                          Age
benchmark       benchmark : fix result validation in benchmark-q4_0-matmult (#987)                                   1 year ago
embedding       examples: add missing <ctime> include for time() (#1011)                                             1 year ago
main            main : evaluate tokens in batches after swapping context (#1014)                                     1 year ago
perplexity      Show perplexity ETA in hours and minutes (#1096)                                                     1 year ago
quantize        llama : multi-threaded quantization (#1075)                                                          1 year ago
quantize-stats  llama : multi-threaded quantization (#1075)                                                          1 year ago
CMakeLists.txt  Add quantize-stats command for testing quantization (#728)                                           1 year ago
Miku.sh         Fix whitespace, add .editorconfig, add GitHub workflow (#883)                                        1 year ago
alpaca.sh       examples : Improve Alpaca Default Repeat Penalty: Better Match Alpaca.cpp Experience (#1107)         1 year ago
chat-13B.bat    Create chat-13B.bat (#592)                                                                           1 year ago
chat-13B.sh     Move chat scripts into "./examples"                                                                  1 year ago
chat.sh         If n_predict == -1, generate forever                                                                 1 year ago
common.cpp      Add LoRA support (#820)                                                                              1 year ago
common.h        llama : have n_batch default to 512 (#1091)                                                          1 year ago
gpt4all.sh      examples : add -n to alpaca and gpt4all scripts (#706)                                               1 year ago
reason-act.sh   add example of re-act pattern (#583)                                                                 1 year ago