From: Georgi Gerganov
Date: Mon, 18 Aug 2025 15:11:44 +0000 (+0300)
Subject: readme : update hot topics (#15397)
X-Git-Tag: upstream/0.0.6199~5
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=3007baf201e7ffcda17dbdb0335997fa50a6595b;p=pkg%2Fggml%2Fsources%2Fllama.cpp

readme : update hot topics (#15397)
---

diff --git a/README.md b/README.md
index 11d92907..84467563 100644
--- a/README.md
+++ b/README.md
@@ -17,6 +17,7 @@ LLM inference in C/C++
 
 ## Hot topics
 
+- **[guide : running gpt-oss with llama.cpp](https://github.com/ggml-org/llama.cpp/discussions/15396)**
 - **[[FEEDBACK] Better packaging for llama.cpp to support downstream consumers 🤗](https://github.com/ggml-org/llama.cpp/discussions/15313)**
 - Support for the `gpt-oss` model with native MXFP4 format has been added | [PR](https://github.com/ggml-org/llama.cpp/pull/15091) | [Collaboration with NVIDIA](https://blogs.nvidia.com/blog/rtx-ai-garage-openai-oss) | [Comment](https://github.com/ggml-org/llama.cpp/discussions/15095)
 - Hot PRs: [All](https://github.com/ggml-org/llama.cpp/pulls?q=is%3Apr+label%3Ahot+) | [Open](https://github.com/ggml-org/llama.cpp/pulls?q=is%3Apr+label%3Ahot+is%3Aopen)