[-] SillySausage@lemmynsfw.com 10 points 3 months ago

I successfully ran a local Llama model with llama.cpp on an old AMD GPU. I'm not sure why you think there's no other option.
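For reference, a minimal sketch of how such a setup typically looks: building llama.cpp with its Vulkan backend (which works on older AMD GPUs that lack ROCm support) and running a quantized GGUF model. The model path and filename are placeholders, not something from the comment above.

```shell
# Build llama.cpp with the Vulkan backend enabled
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# Run inference, offloading as many layers as possible to the GPU
# (-ngl 99). The .gguf file here is an example placeholder.
./build/bin/llama-cli -m ./models/model-q4_k_m.gguf -ngl 99
```

Whether Vulkan or ROCm is the better backend depends on the specific GPU generation; Vulkan tends to be the fallback that works on cards AMD no longer supports in ROCm.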

this post was submitted on 26 Dec 2025
339 points (99.1% liked)

Linux
