Can you self-host AI at parity with ChatGPT?
(lemmy.ml)
You didn't specify what type of model you're trying to run. DeepSeek-R1, for example, comes in various sizes, and if you're trying to run the massive ones it's not gonna work. You need to use a smaller model. I have an RX 6600 and run the 14B parameter version, and it does well.
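For reference, here's a minimal sketch of picking one of the smaller variants. It assumes you're serving the model locally with Ollama and its Python client; the `ollama` package and the `deepseek-r1:14b` tag are assumptions on my part, not something the original post specifies.

```python
# Minimal sketch: pull and query a smaller DeepSeek-R1 variant through Ollama.
# Assumes the Ollama server is running locally and the `ollama` Python package
# is installed (pip install ollama). The model tag is an assumption; smaller
# tags like deepseek-r1:7b or deepseek-r1:1.5b also exist if 14b doesn't fit.
import ollama

MODEL = "deepseek-r1:14b"

ollama.pull(MODEL)  # downloads the weights if they aren't present yet

response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Why do smaller models fit on consumer GPUs?"}],
)
print(response["message"]["content"])
```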
To be clear, btw, your CPU basically doesn't matter as far as I know; only the GPU should be getting used, so any old CPU works. You CAN run it on the CPU alone, but it's gonna be very slow. But yeah, the RX 6600 was decently cheap, I got it for like $150, so it's not super expensive to run one of these models.
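If you want to sanity-check whether the GPU is actually carrying the load, one rough way is to time the generation speed, since CPU-only inference is usually an order of magnitude slower. A sketch, again assuming the same Ollama setup; the `eval_count` and `eval_duration` fields come from Ollama's response metadata.

```python
# Rough tokens-per-second check: unusually low numbers usually mean the model
# fell back to the CPU or spilled out of VRAM. Assumes the Ollama setup above.
import ollama

resp = ollama.chat(
    model="deepseek-r1:14b",  # assumed tag; use whatever size you pulled
    messages=[{"role": "user", "content": "Write one sentence about GPUs."}],
)

tokens = resp["eval_count"]            # tokens generated
seconds = resp["eval_duration"] / 1e9  # reported in nanoseconds
print(f"{tokens / seconds:.1f} tokens/sec")
```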