vulkan: support arbitrary KV dimension in flash attention (#16160)
author    Jeff Bolz <redacted>
          Sat, 27 Sep 2025 20:43:39 +0000 (16:43 -0400)
committer GitHub <redacted>
          Sat, 27 Sep 2025 20:43:39 +0000 (22:43 +0200)
commit    e6d65fb02d553bd79cad94e517cdca18b687788d
tree      aec5288ff33fc53f950dfbbb00c4bc58cae15788
parent    8656f5de688cddcaea1d6174535eb60ee23ef6a0

The "Clamp" spec constant is already set based on whether KV is a multiple of Bc,
so use it to control whether bounds checking is performed. Add bounds checking
to the scalar and coopmat1 paths. The coopmat2 path needed no changes: its K/V
tensor loads are already optionally clamped.
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn.comp
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn_base.comp
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn_cm1.comp