Does DeepSeek* Solve the Small Scale Model Performance Puzzle?
(community.intel.com)
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
I tested out a DeepSeek model the other day. It took a full minute to generate text and used up all my context space in a single message. Local consumer models and "small" server-hosted models are probably different classes; on my home PC it was a big performance downgrade.
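For a rough sense of why a single reply can exhaust the context window: reasoning-style models emit a long chain-of-thought before the final answer, and every one of those tokens counts against the context. A back-of-the-envelope sketch, where all the numbers (context size, token counts, generation speed) are illustrative assumptions rather than measurements from the post:

```python
# Back-of-the-envelope: why one reasoning-heavy reply can fill a small context.
# All numbers below are illustrative assumptions, not benchmarks.

CONTEXT_WINDOW = 4096       # tokens a typical local setup might allocate
PROMPT_TOKENS = 300         # a modest user prompt
REASONING_TOKENS = 3500     # hypothetical chain-of-thought emitted before the answer
TOKENS_PER_SECOND = 60      # hypothetical local generation speed

used = PROMPT_TOKENS + REASONING_TOKENS
seconds = REASONING_TOKENS / TOKENS_PER_SECOND

print(f"context used: {used}/{CONTEXT_WINDOW} ({used / CONTEXT_WINDOW:.0%})")
print(f"generation time: ~{seconds:.0f} s")
```

Under these assumptions the reply alone consumes over 90% of a 4096-token window and takes close to a minute to generate, which matches the kind of experience described above.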