[-] SillySausage@lemmynsfw.com 10 points 1 month ago

I successfully ran local Llama with llama.cpp and an old AMD GPU. I'm not sure why you think there's no other option.
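The comment doesn't say which backend was used, but for anyone wanting to try the same thing, a minimal sketch of one common route might look like the following. This assumes a current llama.cpp checkout and uses the Vulkan backend, which tends to cover older AMD cards that ROCm no longer supports; the model path is a placeholder.

```shell
# Hypothetical setup sketch; exact flags and paths vary by distro and GPU.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp

# Build with the Vulkan backend (often the easiest path on older AMD GPUs).
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j"$(nproc)"

# Run any GGUF model; -ngl 99 offloads as many layers as fit onto the GPU.
./build/bin/llama-cli -m /path/to/model.gguf -ngl 99 -p "Hello"
```

On cards that ROCm still supports, building with `-DGGML_HIP=ON` instead is another option.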

this post was submitted on 26 Dec 2025
338 points (99.1% liked)

Linux
