Added api for getting/setting the kv_cache (#685)
author	Christian Falch <redacted>
	Sun, 2 Apr 2023 10:23:04 +0000 (12:23 +0200)
committer	GitHub <redacted>
	Sun, 2 Apr 2023 10:23:04 +0000 (12:23 +0200)
commit	e986f94829bae0b9e66b326acbbba179931c84f1
tree	2bfe56177c5a08f4cf46c8174925f61bd82992cc
parent	c0bb1d3ce21005ab21d686626ba87261a6e3a660
Added api for getting/setting the kv_cache (#685)

The API provides access methods for retrieving the current memory buffer of the kv_cache and its token count.
It also contains a method for setting the kv_cache from a memory buffer.

This makes it possible to load/save history - maybe support a --cache-prompt parameter as well?
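A minimal usage sketch of the new API: it snapshots the kv_cache after a prompt has been evaluated and restores it later, so the history does not have to be re-evaluated. The function names (llama_get_kv_cache, llama_get_kv_cache_size, llama_get_kv_cache_token_count, llama_set_kv_cache) and the helper functions save_history/restore_history are assumptions based on this commit's description; see llama.h in this change for the authoritative declarations.

#include <vector>
#include "llama.h"

// Saved copy of the kv_cache buffer and the number of tokens it contains.
static std::vector<uint8_t> saved_cache;
static int saved_token_count = 0;

void save_history(llama_context * ctx) {
    const uint8_t * cache = llama_get_kv_cache(ctx);          // current kv_cache memory buffer
    const size_t    size  = llama_get_kv_cache_size(ctx);     // size of that buffer in bytes
    saved_cache.assign(cache, cache + size);
    saved_token_count = llama_get_kv_cache_token_count(ctx);  // tokens currently in the cache
}

void restore_history(llama_context * ctx) {
    // Copy the saved buffer back into the context, along with its token count.
    llama_set_kv_cache(ctx, saved_cache.data(), saved_cache.size(), saved_token_count);
}

Persisting this buffer to disk and reloading it on the next run is what would make a --cache-prompt style feature possible: the prompt is evaluated once and its kv_cache is reused afterwards.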

Co-authored-by: Pavol Rusnak <redacted>
llama.cpp
llama.h