this post was submitted on 10 Apr 2026
Linux
The AI features are actually pretty cool!
Using a local AI model (running on your own GPU), you can:
It supports voice input, and I think probably visual input too (e.g. with a webcam).
Best of all: it doesn't send any of that to some data center in the cloud! I mean, you can configure it to do that, but you can just as easily use a local model, say, qwen3.5.
Note: it's not realistic to expect usable results from local models if you have less than 16 GB of VRAM (on your GPU). I mean, some 8-billion-parameter model will run in, say, 8 GB, but you're not going to be satisfied with the results most of the time 🤷
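For a rough sense of why 8 GB is tight, here's a back-of-the-envelope sketch. The numbers are my own approximations (not from any specific runtime): weights take roughly `params × bytes-per-param`, plus a couple of GB for the KV cache and activations.

```python
def vram_estimate_gb(params_billion: float,
                     bytes_per_param: float = 2.0,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM needed to run a model locally.

    params_billion: model size in billions of parameters.
    bytes_per_param: 2.0 for fp16 weights; ~0.5 for 4-bit quantization.
    overhead_gb: assumed fixed budget for KV cache and activations.
    """
    return params_billion * bytes_per_param + overhead_gb

# An 8B model in fp16 wants ~17.5 GB -- far too big for an 8 GB card.
print(round(vram_estimate_gb(8), 1))        # → 17.5
# Quantized to ~4 bits (~0.5 bytes/param) it squeezes into ~5.5 GB,
# which fits, but quantization is part of why the quality suffers.
print(round(vram_estimate_gb(8, 0.5), 1))   # → 5.5
```

So an 8 GB card only fits an 8B model once it's been heavily quantized, which lines up with the "you won't be satisfied" experience above.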