From: Riceball LEE
Date: Mon, 23 Sep 2024 15:58:17 +0000 (+0800)
Subject: readme : add programmable prompt engine language CLI (#9599)
X-Git-Tag: upstream/0.0.4488~677
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=1d48e98e4f3316bd2f6b187d288c7b6cb88d5cb3;p=pkg%2Fggml%2Fsources%2Fllama.cpp

readme : add programmable prompt engine language CLI (#9599)
---

diff --git a/README.md b/README.md
index 4d24dd59..ce954f71 100644
--- a/README.md
+++ b/README.md
@@ -112,6 +112,7 @@ Typically finetunes of the base models below are supported as well.
 - Go: [go-skynet/go-llama.cpp](https://github.com/go-skynet/go-llama.cpp)
 - Node.js: [withcatai/node-llama-cpp](https://github.com/withcatai/node-llama-cpp)
 - JS/TS (llama.cpp server client): [lgrammel/modelfusion](https://modelfusion.dev/integration/model-provider/llamacpp)
+- JS/TS (Programmable Prompt Engine CLI): [offline-ai/cli](https://github.com/offline-ai/cli)
 - JavaScript/Wasm (works in browser): [tangledgroup/llama-cpp-wasm](https://github.com/tangledgroup/llama-cpp-wasm)
 - Typescript/Wasm (nicer API, available on npm): [ngxson/wllama](https://github.com/ngxson/wllama)
 - Ruby: [yoshoku/llama_cpp.rb](https://github.com/yoshoku/llama_cpp.rb)