From: Ziang Wu
Date: Wed, 20 Mar 2024 15:29:51 +0000 (+0800)
Subject: llava : update MobileVLM-README.md (#6180)
X-Git-Tag: upstream/0.0.4488~2010
X-Git-Url: https://git.djapps.eu/?a=commitdiff_plain;h=f9c7ba34476ffc4f13ae2cdb1aec493a16eb8d47;p=pkg%2Fggml%2Fsources%2Fllama.cpp

llava : update MobileVLM-README.md (#6180)
---

diff --git a/examples/llava/MobileVLM-README.md b/examples/llava/MobileVLM-README.md
index c1f361d1..4d5fef02 100644
--- a/examples/llava/MobileVLM-README.md
+++ b/examples/llava/MobileVLM-README.md
@@ -6,7 +6,7 @@ for more information, please go to [Meituan-AutoML/MobileVLM](https://github.com
 
 The implementation is based on llava, and is compatible with llava and mobileVLM. The usage is basically same as llava.
 
-Notice: The overall process of model inference for both **MobilVLM** and **MobilVLM_V2** models is the same, but the process of model conversion is a little different. Therefore, using MobiVLM as an example, the different conversion step will be shown.
+Notice: The overall process of model inference for both **MobileVLM** and **MobileVLM_V2** models is the same, but the process of model conversion is a little different. Therefore, using MobiVLM as an example, the different conversion step will be shown.
 
 ## Usage
 Build with cmake or run `make llava-cli` to build it.
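
For reference, the "## Usage" context above mentions two build paths. A minimal sketch, assuming a llama.cpp checkout from around this commit; the Makefile target `llava-cli` is taken from the README text, while the CMake target name below is an assumption mirroring it:

    # Makefile build, target name as given in the README context above
    make llava-cli

    # CMake build; the llava-cli target name here is assumed, not confirmed by the diff
    cmake -B build
    cmake --build build --config Release --target llava-cli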