From: Eric Curtin
Date: Thu, 4 Sep 2025 09:49:44 +0000 (+0100)
Subject: Document the new max GPU layers default in help (#15771)
X-Git-Tag: upstream/0.0.6527~148
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=badb80cadbc40e047b30c43611aba575fc8d6845;p=pkg%2Fggml%2Fsources%2Fllama.cpp

Document the new max GPU layers default in help (#15771)

This is a key change, just letting users know.

Signed-off-by: Eric Curtin
---

diff --git a/common/arg.cpp b/common/arg.cpp
index fcee0c44..7507c811 100644
--- a/common/arg.cpp
+++ b/common/arg.cpp
@@ -2466,7 +2466,7 @@ common_params_context common_params_parser_init(common_params & params, llama_ex
     ).set_examples({LLAMA_EXAMPLE_SPECULATIVE, LLAMA_EXAMPLE_SERVER}).set_env("LLAMA_ARG_N_CPU_MOE_DRAFT"));
     add_opt(common_arg(
         {"-ngl", "--gpu-layers", "--n-gpu-layers"}, "N",
-        "number of layers to store in VRAM",
+        string_format("max. number of layers to store in VRAM (default: %d)", params.n_gpu_layers),
         [](common_params & params, int value) {
             params.n_gpu_layers = value;
             if (!llama_supports_gpu_offload()) {