This commit removes the stray 'd' from the log message in llama-vocab.cpp
that is printed when a bad special token is encountered.

The motivation for this is that the format string uses "%ud", which
printf-style formatting parses as the conversion "%u" followed by a
literal 'd', so the output currently looks something like the following:
```console
load: bad special token: 'tokenizer.ggml.image_token_id' = 128256d, using default id -1
```
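
The trailing 'd' is not part of the token id; it comes from the format
string itself. A minimal standalone C sketch (not part of llama.cpp)
illustrating the behaviour:

```c
#include <stdio.h>

int main(void) {
    unsigned int new_id = 128256;

    // "%ud" is parsed as the conversion "%u" followed by a literal 'd',
    // so the 'd' is printed right after the number.
    printf("bad special token id = %ud\n", new_id); // prints: 128256d

    // Dropping the stray 'd' gives the intended output.
    printf("bad special token id = %u\n", new_id);  // prints: 128256

    return 0;
}
```

The change below removes the stray 'd' from the LLAMA_LOG_WARN format
string: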
```diff
             continue;
         }
         if (new_id >= id_to_token.size()) {
-            LLAMA_LOG_WARN("%s: bad special token: '%s' = %ud, using default id %d\n",
+            LLAMA_LOG_WARN("%s: bad special token: '%s' = %u, using default id %d\n",
                 __func__, key.c_str(), new_id, id);
         } else {
             id = new_id;
```