Did DeepSeek R1 just pop Nvidia's bubble?
(www.youtube.com)
Hard to tell at this stage. Models may get a hell of a lot bigger if the hardware required to train them is much smaller, or progress may plateau for a while as everyone else works on training their own models using significantly less power and hardware.