server : fix first message identification (#13634)
author Dorin-Andrei Geman <redacted>
Wed, 21 May 2025 13:07:57 +0000 (16:07 +0300)
committer GitHub <redacted>
Wed, 21 May 2025 13:07:57 +0000 (15:07 +0200)
commit 42158ae2e8ead667a83f07247321ce85f32ace66
tree e475f838240f8a80902d67322b1baa8251106044
parent 797f2ac0625b22edeff03cc30e0f988da6b6b068
server : fix first message identification (#13634)

* server : fix first message identification

When using the OpenAI SDK (https://github.com/openai/openai-node/blob/master/src/lib/ChatCompletionStream.ts#L623-L626), we noticed that the expected assistant role was missing from the first streaming message. Fix this by correctly checking for the first message.
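
As an illustration (not code from this commit), here is a minimal client-side sketch of the behaviour being fixed: in the OpenAI streaming format, the first chunk of a chat completion stream is expected to announce the assistant role via delta.role, and the linked ChatCompletionStream helper relies on that. The base_url, api_key, and model name below are placeholders for a locally running llama.cpp server.

    # Sketch: confirm the first streamed chunk carries the assistant role.
    # Assumes the `openai` Python package and a llama.cpp server on localhost:8080.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-no-key-required")

    stream = client.chat.completions.create(
        model="local-model",  # placeholder model name
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,
    )

    first_chunk = next(iter(stream))
    # The role should be set on the very first delta; later chunks typically omit it.
    assert first_chunk.choices[0].delta.role == "assistant"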

Co-authored-by: Piotr Stankiewicz <redacted>
Signed-off-by: Dorin Geman <redacted>
* server : fix checks for the first role message when stream=True

Co-authored-by: Piotr Stankiewicz <redacted>
Signed-off-by: Dorin Geman <redacted>
---------

Signed-off-by: Dorin Geman <redacted>
Co-authored-by: Piotr Stankiewicz <redacted>
tools/server/server.cpp
tools/server/tests/unit/test_chat_completion.py
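
In the same spirit as the test added in tools/server/tests/unit/test_chat_completion.py (which uses the server's own test harness), a hypothetical raw-HTTP check of the stream=True path is sketched below; the endpoint URL and request payload are assumptions for a local server, not taken from the commit.

    # Sketch: parse the raw SSE stream and check the first data chunk's delta.
    # Assumes a llama.cpp server listening on localhost:8080.
    import json
    import requests

    resp = requests.post(
        "http://localhost:8080/v1/chat/completions",
        json={
            "messages": [{"role": "user", "content": "Hello"}],
            "stream": True,
        },
        stream=True,
    )

    for line in resp.iter_lines():
        if not line or not line.startswith(b"data: "):
            continue
        payload = line[len(b"data: "):]
        if payload == b"[DONE]":
            break
        first_delta = json.loads(payload)["choices"][0]["delta"]
        # Only the first chunk is inspected: it should carry the assistant role.
        assert first_delta.get("role") == "assistant"
        break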