Inference of [LLaMA](https://arxiv.org/abs/2302.13971) model in pure C/C++
**Hot topics:**
- RMSNorm implementation / fixes: https://github.com/ggerganov/llama.cpp/issues/173
- Cache input prompts for faster initialization: https://github.com/ggerganov/llama.cpp/issues/64
- Create a `llama.cpp` logo: https://github.com/ggerganov/llama.cpp/issues/105