git.djapps.eu Git - pkg/ggml/sources/whisper.cpp/commit
vulkan: support arbitrary KV dimension in flash attention (llama/16160)
author Jeff Bolz <redacted>
Sat, 27 Sep 2025 20:43:39 +0000 (16:43 -0400)
committer Georgi Gerganov <redacted>
Mon, 29 Sep 2025 12:18:12 +0000 (15:18 +0300)
commit eb982dd786f5be809dbde762eb54f74d52f070d0
tree 3aca40a63a5b597613d5dae694abb42e5bd21415
parent bc1ac13c2f3f8dca4fb4b0b6071f86766ff247dc
vulkan: support arbitrary KV dimension in flash attention (llama/16160)

The "Clamp" spec constant is already based on whether KV is a multiple of Bc,
so use that to control whether bounds checking is performed. Add bounds checking
to the scalar and coopmat1 paths. Coopmat2 didn't need any changes (the K/V
tensors are already optionally clamped, nothing else needed to be changed).
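For context, a minimal, hypothetical GLSL sketch (not the actual flash_attn shader; the Params, KBuf, and k_tile names and layout are assumptions) of how a "Clamp" specialization constant can gate the bounds check on a K tile load when KV is not a multiple of Bc:

#version 450

// Hypothetical sketch, not the ggml-vulkan source: the host sets Clamp = 1 only
// when KV % Bc != 0, so the bounds check exists only for the partial last tile.
layout (constant_id = 0) const uint Bc    = 32;  // K/V columns per tile
layout (constant_id = 1) const uint D     = 64;  // head dimension
layout (constant_id = 2) const uint Clamp = 0;   // 1 when KV is not a multiple of Bc

layout (push_constant) uniform Params {
    uint KV;        // total number of key/value rows
} p;

layout (binding = 0, std430) readonly buffer KBuf { float k_data[]; };

layout (local_size_x = 32, local_size_y = 1, local_size_z = 1) in;

shared float k_tile[32 * 64];   // Bc * D for the default spec constants above

void main() {
    const uint tile_start = gl_WorkGroupID.x * Bc;

    // Each invocation loads a subset of the columns of the current K tile.
    for (uint c = gl_LocalInvocationID.x; c < Bc; c += gl_WorkGroupSize.x) {
        const uint kv_row = tile_start + c;

        // When Clamp == 0, KV is a multiple of Bc, so kv_row < p.KV always holds
        // and the compiler can fold the test away after specialization. When
        // Clamp == 1, the check keeps the last (partial) tile from reading past
        // the end of K; out-of-range columns are zeroed so they contribute nothing.
        if (Clamp != 0 && kv_row >= p.KV) {
            for (uint d = 0; d < D; ++d) {
                k_tile[c * D + d] = 0.0;
            }
        } else {
            for (uint d = 0; d < D; ++d) {
                k_tile[c * D + d] = k_data[kv_row * D + d];
            }
        }
    }
    // ... the rest of the flash-attention tile computation would follow here.
}

Because Clamp is a specialization constant rather than a runtime flag, the fully-aligned case pays no cost for the check: the branch is resolved at pipeline creation time.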
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn.comp
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn_base.comp
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn_cm1.comp