model: support Ministral3 (#17644)
author    Xuan-Son Nguyen <redacted>
Mon, 1 Dec 2025 11:26:52 +0000 (12:26 +0100)
committer GitHub <redacted>
Mon, 1 Dec 2025 11:26:52 +0000 (12:26 +0100)
commit    cd3c1189082e9ecca172b5bea2442606f489f439
tree      54c0960f9af4558b8f7a057c644777eec76baa0b
parent    649495c9d915a284aeec5bca5d0efaa6d1bc7c87
model: support Ministral3 (#17644)

* conversion script

* support ministral 3

* maybe this is better?

* add TODO for rope_yarn_log_mul

* better ppl (tested on 14B-Instruct)

* Add Ministral3 support to Mistral format

* improve arch handling

* add sizes

* Apply suggestions from code review

Co-authored-by: Sigbjørn Skjæret <redacted>
* nits

---------

Co-authored-by: Julien Denize <redacted>
Co-authored-by: Sigbjørn Skjæret <redacted>
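The commit log above mentions a TODO for `rope_yarn_log_mul`, the hyperparameter llama.cpp uses for the YaRN attention-temperature correction. As a rough, non-authoritative sketch of what that factor computes (function name and default value here are illustrative, not the actual llama.cpp API):

```python
import math

def yarn_attn_factor(scale: float, rope_yarn_log_mul: float = 0.1) -> float:
    """Sketch of the YaRN attention scaling term.

    `scale` is the context-extension factor (trained context / target
    context); the correction is 1.0 when no extension is applied and
    grows logarithmically with the extension factor, weighted by
    `rope_yarn_log_mul` (a model hyperparameter in GGUF metadata).
    """
    if scale <= 1.0:
        return 1.0
    return rope_yarn_log_mul * math.log(scale) + 1.0
```

The `better ppl (tested on 14B-Instruct)` bullet suggests this term was tuned against measured perplexity rather than derived analytically.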
convert_hf_to_gguf.py
gguf-py/gguf/constants.py
gguf-py/gguf/gguf_writer.py
src/CMakeLists.txt
src/llama-arch.cpp
src/llama-arch.h
src/llama-graph.cpp
src/llama-hparams.h
src/llama-model.cpp
src/models/mistral3.cpp [new file with mode: 0644]
src/models/models.h