webui: updated the chat service to only include max_tokens in the req… (#16489)
author     Pascal <redacted>
           Thu, 9 Oct 2025 20:54:57 +0000 (22:54 +0200)
committer  GitHub <redacted>
           Thu, 9 Oct 2025 20:54:57 +0000 (22:54 +0200)
commit     1faa13a1187051af66b0fd9f0d6effe4c77f0b3e
tree       ad6138a25910dc11f19e4cceb3260045dae685cb
parent     1deee0f8d494981c32597dca8b5f8696d399b0f2

* webui: updated the chat service to include max_tokens in the request payload only when the setting is explicitly provided, while still mapping explicit zero or null values to the infinite-token sentinel (a TypeScript sketch of this behavior follows the changed-file list below)

* chore: update webui build output

tools/server/public/index.html.gz
tools/server/webui/src/lib/services/chat.ts
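
The change described above concerns how the webui's chat service (tools/server/webui/src/lib/services/chat.ts) builds the completion request. As a minimal sketch of that behavior, and not the actual implementation, the TypeScript below omits max_tokens when the user has never set it, and maps an explicit zero or null to an infinite-token sentinel. The ChatSettings shape, the maxTokensPayload helper, and the -1 sentinel value are assumptions made for illustration.

```ts
// Hypothetical shape of the user setting relevant to this change.
interface ChatSettings {
  // User-facing "max tokens" value; undefined means the setting was never provided.
  max_tokens?: number | null;
}

// Assumed sentinel that the server interprets as "no token limit".
const INFINITE_TOKENS = -1;

// Build only the max_tokens portion of the request payload.
function maxTokensPayload(settings: ChatSettings): { max_tokens?: number } {
  // Setting not provided: omit the field entirely so the server default applies.
  if (settings.max_tokens === undefined) return {};

  // Explicit zero or null: the user asked for "unlimited", so send the sentinel.
  if (settings.max_tokens === null || settings.max_tokens === 0) {
    return { max_tokens: INFINITE_TOKENS };
  }

  // Any other explicit value is forwarded unchanged.
  return { max_tokens: settings.max_tokens };
}

// Example: assembling a chat completion request body.
const body = {
  messages: [{ role: 'user', content: 'Hello' }],
  ...maxTokensPayload({ max_tokens: 0 }), // yields max_tokens: -1
};
console.log(JSON.stringify(body));
```

Spreading the helper's result into the body keeps the field absent rather than sending max_tokens with an undefined value, which is the point of the change: an untouched setting no longer appears in the payload at all.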