vulkan: fix fp16 Flash Attention on Windows AMD RDNA2 and below (#19921)
author    Ruben Ortlam <redacted>
          Thu, 26 Feb 2026 18:11:04 +0000 (19:11 +0100)
committer GitHub <redacted>
          Thu, 26 Feb 2026 18:11:04 +0000 (19:11 +0100)
commit 723c71064da0908c19683f8c344715fbf6d986fd
tree   1a2532768b5c8f650ebc03b926c7ef5263b03548
parent 37964f44f9fab37571b27cccd9f45d4a066e0817
ggml/src/ggml-vulkan/ggml-vulkan.cpp
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn.comp
ggml/src/ggml-vulkan/vulkan-shaders/flash_attn_base.glsl