unsure how to quantize a model
I think I may try it this way if Kobold uses Vulkan instead of ROCm (roughly the commands sketched below); it's most likely going to be way less of a headache.
As for the model, it's just what came out of a random search on Reddit for a decent small model. No reason in particular; thanks for the suggestion.
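
For reference, the route I have in mind is roughly: convert the checkpoint to GGUF and quantize it with llama.cpp's tools, then load the result in KoboldCpp with its Vulkan backend instead of ROCm. The file and directory names below are placeholders, and script/flag names (convert_hf_to_gguf.py, llama-quantize, --usevulkan) can differ between versions, so check the current docs before running.

```sh
# 1) Convert the Hugging Face checkpoint to an fp16 GGUF with llama.cpp's converter
#    (./my-model-dir and the output names are placeholders).
python convert_hf_to_gguf.py ./my-model-dir --outfile my-model-f16.gguf --outtype f16

# 2) Quantize the fp16 GGUF down to a smaller format, e.g. Q4_K_M.
./llama-quantize my-model-f16.gguf my-model-Q4_K_M.gguf Q4_K_M

# 3) Run the quantized model in KoboldCpp using the Vulkan backend.
python koboldcpp.py --model my-model-Q4_K_M.gguf --usevulkan
```

If that works, the appeal is skipping the ROCm setup entirely and letting Vulkan handle the GPU offload.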