* add LLMUnity to UI projects
* add newline to examples/rpc/README.md to fix editorconfig-checker unit test
- [AI Sublime Text plugin](https://github.com/yaroslavyaroslav/OpenAI-sublime-text) (MIT)
- [AIKit](https://github.com/sozercan/aikit) (MIT)
- [LARS - The LLM & Advanced Referencing Solution](https://github.com/abgulati/LARS) (AGPL)
+- [LLMUnity](https://github.com/undreamai/LLMUnity) (MIT)
*(to have a project listed here, it should clearly state that it depends on `llama.cpp`)*
$ bin/llama-cli -m ../models/tinyllama-1b/ggml-model-f16.gguf -p "Hello, my name is" --repeat-penalty 1.0 -n 64 --rpc 192.168.88.10:50052,192.168.88.11:50052 -ngl 99
```
-This way you can offload model layers to both local and remote devices.
\ No newline at end of file
+This way you can offload model layers to both local and remote devices.
+
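For context on the `--rpc` endpoints used in the command above: each listed `host:port` is expected to be a machine running llama.cpp's `rpc-server` binary. A minimal sketch of bringing one up, assuming a build configured with the RPC backend enabled (the exact CMake flag and default port may differ by version):

```shell
# On each remote host, build llama.cpp with the RPC backend enabled
# (GGML_RPC is an assumption about the flag name; check your version's docs)
cmake -B build -DGGML_RPC=ON
cmake --build build --config Release

# Start the RPC server; the port must match the one passed to --rpc
# on the client (50052 in the llama-cli example above)
bin/rpc-server -p 50052
```

With a server listening on each remote host, the client-side `--rpc 192.168.88.10:50052,192.168.88.11:50052` flag distributes offloaded layers across them alongside any local GPU.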