From: Daniel Bevenius
Date: Mon, 24 Nov 2025 13:38:45 +0000 (+0100)
Subject: examples : add -kvu to batched usage example [no ci] (#17469)
X-Git-Tag: upstream/0.0.7446~301
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=6ab8eacddf50cda653b1e27521bd88945c41df1b;p=pkg%2Fggml%2Fsources%2Fllama.cpp

examples : add -kvu to batched usage example [no ci] (#17469)

This commit adds the --kv-unified flag to the usage example in the
README.md file for the batched example.

The motivation for this is that without this flag the example will fail
with the following error:

```console
Hello my name is split_equal: sequential split is not supported when there are coupled sequences in the input batch (you may need to use the -kvu flag)
decode: failed to find a memory slot for batch of size 4
main: llama_decode() failed
```
---

diff --git a/examples/batched/README.md b/examples/batched/README.md
index 6013aab0..8cde35dd 100644
--- a/examples/batched/README.md
+++ b/examples/batched/README.md
@@ -3,7 +3,7 @@
 The example demonstrates batched generation from a given prompt

 ```bash
-./llama-batched -m ./models/llama-7b-v2/ggml-model-f16.gguf -p "Hello my name is" -np 4
+./llama-batched -m ./models/llama-7b-v2/ggml-model-f16.gguf -p "Hello my name is" -np 4 --kv-unified
 ...