Simple webchat for server (#1998)
author      Tobias Lütke <redacted>
            Tue, 4 Jul 2023 14:05:27 +0000 (10:05 -0400)
committer   GitHub <redacted>
            Tue, 4 Jul 2023 14:05:27 +0000 (16:05 +0200)
commit      7ee76e45afae7f9a7a53e93393accfb5b36684e1
tree        113fc52e6b154de3d1b1c39fbcae1ab002f884e6
parent      acc111caf93fc6681450924df9f99679c384c59e
Simple webchat for server (#1998)

* expose simple web interface on root domain

* embed index.html and add --path for choosing the static dir (see the serving sketch after this message)

* allow server to multithread

Because web browsers send a lot of garbage requests, we want the server
to multithread when serving 404s for favicons etc. To avoid blowing up
llama, we just take a mutex when it's invoked (see the mutex sketch
after this message).

* let's try this with the xxd tool instead and see if msvc is happier with that

* enable server in Makefiles

* add /completion.js file to make it easy to use the server from js

* slightly nicer css

* rework state management into session, expose historyTemplate to settings

---------

Co-authored-by: Georgi Gerganov <redacted>
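
Note on the embedded page: the *.hpp files above are generated from the
files in examples/server/public/ with xxd (per the commit message;
deps.sh presumably drives this), so the server binary can serve the chat
UI without any files on disk unless --path points at a directory. Below
is a minimal sketch of how such a generated header could be served with
cpp-httplib; the symbol names follow xxd's -i convention and the handler
is illustrative, not the exact code from this commit.

    // Sketch only: the symbol names (index_html, index_html_len) follow the
    // `xxd -i index.html` convention and are assumptions, not the exact
    // identifiers from this commit.
    #include "httplib.h"        // cpp-httplib single-header HTTP server
    #include "index.html.hpp"   // xxd-generated byte array of the page

    int main() {
        httplib::Server svr;

        // Serve the embedded chat page on "/" so pointing a browser at the
        // server just works, with no static directory required.
        svr.Get("/", [](const httplib::Request &, httplib::Response &res) {
            res.set_content(reinterpret_cast<const char *>(index_html),
                            index_html_len, "text/html");
        });

        return svr.listen("127.0.0.1", 8080) ? 0 : 1;
    }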
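Note on the mutex: the idea is that the HTTP layer answers cheap requests
(favicon.ico, 404s) from multiple threads, while a single mutex serializes
every call into the llama context, which is not thread-safe. A minimal
sketch under that assumption; run_llama_completion() is a placeholder,
not a function from this commit.

    #include <mutex>
    #include <string>
    #include "httplib.h"

    // Placeholder for the actual inference call; illustrative only.
    static std::string run_llama_completion(const std::string &prompt) {
        (void)prompt;                     // unused in this stub
        return "{\"content\": \"...\"}";  // stub JSON response
    }

    // One mutex guards the (single, non-thread-safe) llama context.
    static std::mutex llama_mutex;

    int main() {
        httplib::Server svr;  // cpp-httplib serves requests from a thread pool

        svr.Post("/completion",
                 [](const httplib::Request &req, httplib::Response &res) {
            // Only one completion runs at a time; other threads wait here,
            // but unrelated requests are still answered concurrently.
            std::lock_guard<std::mutex> lock(llama_mutex);
            res.set_content(run_llama_completion(req.body), "application/json");
        });

        return svr.listen("127.0.0.1", 8080) ? 0 : 1;
    }
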
CMakeLists.txt
examples/server/completion.js.hpp [new file with mode: 0644]
examples/server/deps.sh [new file with mode: 0755]
examples/server/index.html.hpp [new file with mode: 0644]
examples/server/index.js.hpp [new file with mode: 0644]
examples/server/public/completion.js [new file with mode: 0644]
examples/server/public/index.html [new file with mode: 0644]
examples/server/public/index.js [new file with mode: 0644]
examples/server/server.cpp