gandalf_der_12te@lemmy.blahaj.zone · 1 point · 16 hours ago

actually i think it kinda is nice and easy to do, i'm just too lazy/cheap to rent a server with 8GB of RAM, even though it would only cost $15/month or sth.

Smorty@lemmy.blahaj.zone · 2 points · 15 hours ago

it would also be super slow, u usually want a GPU for LLM inference... but u already know this, u are Gandalf der Zwölfte after all <3
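The 8GB-RAM constraint mentioned above can be sketched with simple arithmetic: the weights of a model must fit in memory before inference can even start, and the per-parameter byte count set by quantization decides whether that works. The 7B parameter count and the byte-per-parameter figures below are illustrative assumptions, not measurements of any specific model.

```python
# Rough memory-footprint estimate for holding LLM weights in RAM.
# All model sizes and precisions here are illustrative assumptions.

def model_memory_gib(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GiB needed just to hold the weights (no KV cache, no overhead)."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# Hypothetical 7B-parameter model:
fp16 = model_memory_gib(7, 2.0)   # 16-bit floats: ~13 GiB, won't fit in 8 GB
q4   = model_memory_gib(7, 0.5)   # 4-bit quantized: ~3.3 GiB, fits with headroom

print(f"fp16: {fp16:.1f} GiB, 4-bit: {q4:.1f} GiB")
```

Even when a quantized model fits in RAM, CPU-only inference is typically an order of magnitude slower than on a GPU, since token generation is bound by memory bandwidth, which is the point being made above.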

this post was submitted on 05 Feb 2025
