[-] webghost0101@sopuli.xyz 2 points 1 year ago

12GB of VRAM is still an upgrade away for most people, and a 4-bit quantized 13B model is barely going to be a tech demo. When open-source AI is proclaimed to be near/on par with/better than GPT-4, they are talking about nothing other than their biggest models in a prime environment.
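(The napkin math behind that: at 4 bits per weight, a 13B model's weights alone are about 6.5 GB, but KV cache and activations eat into the rest of a 12 GB card. A rough sketch, where the flat overhead figure is just an assumption for illustration:)

```python
def quantized_vram_gb(params_billion, bits_per_weight, overhead_gb=1.5):
    """Rough VRAM estimate: weight memory plus a flat overhead
    for KV cache and activations (overhead_gb is a guess, and in
    practice grows with context length)."""
    weight_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

# 13B model at 4-bit: 6.5 GB of weights + ~1.5 GB overhead
print(round(quantized_vram_gb(13, 4), 1))  # → 8.0
```

So it fits on a 12 GB card, but the headroom disappears fast at longer contexts or higher-bit quantization.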

[-] just_another_person@lemmy.world 1 points 1 year ago

Sure, but not for standard cloud instances that are very affordable for companies wanting to get away from OpenAI.

[-] webghost0101@sopuli.xyz 1 points 1 year ago

I usually don’t think much about companies and cloud instances when it comes to FOSS AI, but fair enough.

For me it’s all about locally run consumer models. If we cannot achieve that, it means we will always need to rely on the whims and decisions of others to access the most transformative technology ever invented.

this post was submitted on 09 Oct 2023
36 points (100.0% liked)

Futurology
