From: Mathieu Baudier
Date: Tue, 21 Jan 2025 12:31:47 +0000 (+0100)
Subject: Make sure that at least the CPU backend is available for utilities
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=426c1ef7df4104b071a386968bc7125f46b231d0;p=pkg%2Fggml%2Fsources%2Fllama.cpp

Make sure that at least the CPU backend is available for utilities
---

diff --git a/debian/control b/debian/control
index 919503e3..a6728836 100644
--- a/debian/control
+++ b/debian/control
@@ -22,7 +22,7 @@ Package: llama-cpp-cli
 Architecture: any
 Priority: optional
 Depends: ${misc:Depends}, ${shlibs:Depends},
- libllama, curl
+ libllama, ggml, curl
 Description: Inference of LLMs in pure C/C++ (CLI)
  Llama.cpp inference of LLMs in pure C/C++ (CLI).
 
@@ -30,7 +30,7 @@ Package: llama-cpp-server
 Architecture: any
 Priority: optional
 Depends: ${misc:Depends}, ${shlibs:Depends},
- libllama, curl, openssl
+ libllama, ggml, curl, openssl
 Description: Inference of LLMs in pure C/C++ (CLI)
  Llama.cpp inference of LLMs in pure C/C++ (CLI).
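
Context for the dependency change: llama.cpp can always fall back to ggml's CPU backend when no accelerator backend is present, so llama-cpp-cli and llama-cpp-server need the shared libraries shipped by the ggml package at runtime. The explicit ggml entry in Depends is presumably required because such backend libraries are not picked up by ${shlibs:Depends}. The C sketch below is not taken from the packaged sources; it only illustrates this fallback, and the header that declares ggml_backend_cpu_init() (ggml-cpu.h vs ggml-backend.h) varies between ggml versions.

/* Minimal sketch, assuming a recent ggml where the CPU backend lives in
 * ggml-cpu.h; older versions declare ggml_backend_cpu_init() in
 * ggml-backend.h. Illustrates the "at least the CPU backend" fallback. */
#include <stdio.h>
#include "ggml-backend.h"
#include "ggml-cpu.h"

int main(void) {
    /* Initialize the CPU backend, the one backend that should always exist. */
    ggml_backend_t cpu = ggml_backend_cpu_init();
    if (cpu == NULL) {
        fprintf(stderr, "ggml CPU backend could not be initialized\n");
        return 1;
    }
    printf("using backend: %s\n", ggml_backend_name(cpu));
    ggml_backend_free(cpu);
    return 0;
}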