What have you been up to recently with your local LLMs?
(discuss.tchncs.de)
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
Thank you for pointing that out. I was completely unaware of Microsoft Guidance. Once they merge/implement llama.cpp support, I'm definitely going to try it, too.
That will certainly be amazing, but for now it's actually not bad to use either the oobabooga web UI or koboldcpp to run inference and expose a REST endpoint, because you can trick basically any program into treating it as if it's OpenAI and use it the same way.
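For anyone curious what that looks like in practice, here's a minimal sketch of talking to a local server through the OpenAI-style chat completions route. The base URL, port, and model name are assumptions, just placeholders for whatever your local server (oobabooga's web UI or koboldcpp with its OpenAI-compatible API enabled) actually listens on:

```python
import json
import urllib.request

# Assumed local endpoint -- adjust host/port to your own setup.
BASE_URL = "http://localhost:5000/v1"


def build_payload(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(prompt: str, model: str = "local-model") -> str:
    """POST the request to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={
            "Content-Type": "application/json",
            # Most local servers accept any key, but OpenAI clients send one.
            "Authorization": "Bearer not-needed",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Since the request body matches what the OpenAI API expects, any client library or app that lets you override the base URL will work against it unchanged.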