From: 0xsourcecode
Date: Wed, 24 May 2023 08:23:51 +0000 (-0400)
Subject: readme : highlight OpenBLAS support (#956)
X-Git-Tag: upstream/1.7.4~1424
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=4e16a8fb63406582b5b57aae1c6a99d9c2bf4475;p=pkg%2Fggml%2Fsources%2Fwhisper.cpp

readme : highlight OpenBLAS support (#956)

* highlight openblas support

* Update README.md
---

diff --git a/README.md b/README.md
index 7d76f900..3d4b300d 100644
--- a/README.md
+++ b/README.md
@@ -21,6 +21,7 @@ High-performance inference of [OpenAI's Whisper](https://github.com/openai/whisp
 - Runs on the CPU
 - [Partial GPU support for NVIDIA via cuBLAS](https://github.com/ggerganov/whisper.cpp#nvidia-gpu-support-via-cublas)
 - [Partial OpenCL GPU support via CLBlast](https://github.com/ggerganov/whisper.cpp#opencl-gpu-support-via-clblast)
+- [BLAS CPU support via OpenBLAS](https://github.com/ggerganov/whisper.cpp#blas-cpu-support-via-openblas)
 - [C-style API](https://github.com/ggerganov/whisper.cpp/blob/master/whisper.h)
 
 Supported platforms:
@@ -346,6 +347,18 @@ cp bin/* ../
 
 Run all the examples as usual.
 
+## BLAS CPU support via OpenBLAS
+
+Encoder processing can be accelerated on the CPU via OpenBLAS.
+First, make sure you have installed `openblas`: https://www.openblas.net/
+
+Now build `whisper.cpp` with OpenBLAS support:
+
+```
+make clean
+WHISPER_OPENBLAS=1 make -j
+```
+
 ## Limitations
 
 - Inference only