From: Mickael Desgranges
Date: Tue, 3 Mar 2026 13:50:00 +0000 (+0100)
Subject: docs: Fix intel documentation link (#20040)
X-Git-Tag: upstream/0.0.8611~418
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=ecd99d6a9acbc436bad085783bcd5d0b9ae9e9e9;p=pkg%2Fggml%2Fsources%2Fllama.cpp

docs: Fix intel documentation link (#20040)
---

diff --git a/docs/build.md b/docs/build.md
index fd447424c..e6f572c77 100644
--- a/docs/build.md
+++ b/docs/build.md
@@ -108,7 +108,7 @@ Building through oneAPI compilers will make avx_vnni instruction set available f
 
 - Using oneAPI docker image:
   If you do not want to source the environment vars and install oneAPI manually, you can also build the code using intel docker container: [oneAPI-basekit](https://hub.docker.com/r/intel/oneapi-basekit). Then, you can use the commands given above.
 
-Check [Optimizing and Running LLaMA2 on Intel® CPU](https://www.intel.com/content/www/us/en/content-details/791610/optimizing-and-running-llama2-on-intel-cpu.html) for more information.
+Check [Optimizing and Running LLaMA2 on Intel® CPU](https://builders.intel.com/solutionslibrary/optimizing-and-running-llama2-on-intel-cpu) for more information.
 
 ### Other BLAS libraries