common: custom hf endpoint support (#12769)
author エシュナヴァリシア <redacted>
Sat, 5 Apr 2025 13:31:42 +0000 (21:31 +0800)
committer GitHub <redacted>
Sat, 5 Apr 2025 13:31:42 +0000 (15:31 +0200)
commit c6ff5d2a8da2587cc78c9ede9171dfc3f076c757
tree 1b9313773c91af28d01d2467f7d5f084bb9443c6
parent 7a84777f42a9b3ba47db5d20b7662f8ddf92f652
common: custom hf endpoint support (#12769)

* common: custom hf endpoint support

Add support for custom Hugging Face endpoints via the HF_ENDPOINT environment variable.

You can now specify a custom Hugging Face endpoint with the HF_ENDPOINT environment variable when using the --hf-repo flag; it works similarly to huggingface-cli's endpoint configuration.

Example usage:
HF_ENDPOINT=https://hf-mirror.com/ ./bin/llama-cli --hf-repo Qwen/Qwen1.5-0.5B-Chat-GGUF --hf-file qwen1_5-0_5b-chat-q2_k.gguf -p "The meaning to life and the universe is"

The trailing slash in the URL is optional:
HF_ENDPOINT=https://hf-mirror.com ./bin/llama-cli --hf-repo Qwen/Qwen1.5-0.5B-Chat-GGUF --hf-file qwen1_5-0_5b-chat-q2_k.gguf -p "The meaning to life and the universe is"
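A minimal C++ sketch of how this endpoint resolution could work (a hypothetical `get_hf_endpoint` helper, not the exact code in common/arg.cpp): read HF_ENDPOINT if set, fall back to the official host otherwise, and append the trailing slash when missing so later URL joins are safe.

```cpp
#include <cstdlib>
#include <string>

// Resolve the Hugging Face endpoint base URL.
// Priority: HF_ENDPOINT environment variable, then the default host.
// A trailing '/' is appended when absent, making both example
// invocations above behave identically.
static std::string get_hf_endpoint() {
    std::string endpoint = "https://huggingface.co/";
    if (const char * env = std::getenv("HF_ENDPOINT")) {
        endpoint = env;
        if (!endpoint.empty() && endpoint.back() != '/') {
            endpoint += '/';
        }
    }
    return endpoint;
}
```

The model URL is then built by concatenating this base with the repo and file path from --hf-repo / --hf-file.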

* Update common/arg.cpp

Readability improvement

Co-authored-by: Xuan-Son Nguyen <redacted>
* Apply suggestions from code review

---------

Co-authored-by: ベアトリーチェ <redacted>
Co-authored-by: Xuan-Son Nguyen <redacted>
common/arg.cpp