submitted 2 days ago* (last edited 2 days ago) by JOMusic@lemmy.ml to c/opensource@lemmy.ml
[-] domi@lemmy.secnd.me 5 points 2 days ago

Hosting a model of that size requires ~800GB of VRAM. Even if they release their models, it wouldn't make them obsolete, since most people and many companies couldn't host it anyway.
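The ~800GB figure is back-of-the-envelope parameter arithmetic. A minimal sketch of it, assuming a model in the ~400B-parameter range served at fp16 (2 bytes per parameter) and ignoring KV-cache and activation overhead, which add more on top:

```python
# Rough VRAM arithmetic behind the ~800 GB claim.
# Assumptions (illustrative, not from the thread): ~400B parameters,
# fp16/bf16 weights = 2 bytes each, weights only (no KV cache).

def weights_vram_gb(params_billions: float, bytes_per_param: float = 2.0) -> float:
    """VRAM needed just to hold the weights, in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

print(weights_vram_gb(400))        # 800.0 GB at fp16
print(weights_vram_gb(400, 0.5))   # 200.0 GB at 4-bit quantization
```

Quantizing to 4 bits cuts the weight footprint fourfold, but even that stays well out of consumer-hardware range for a model this size.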

[-] rcbrk@lemmy.ml 2 points 1 day ago* (last edited 1 day ago)

Anyone can now provide that service. Why pay OpenAI when you can pay a different service that is cheaper, or better aligned with your needs, ethics, or legal requirements?

[-] domi@lemmy.secnd.me 1 points 1 day ago

Anyone who has $300,000 per instance, the know-how to set it up, the means to support it, and can outbid OpenAI, yes.
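The $300,000 figure is consistent with the VRAM estimate above under some assumed hardware prices. A hypothetical back-of-the-envelope, assuming 80 GB datacenter GPUs at an illustrative ~$30k each (both numbers are assumptions, not quotes from the thread):

```python
import math

# Hypothetical cost sketch: GPU VRAM size and unit price are
# illustrative assumptions, not vendor quotes.

def instance_cost(vram_needed_gb: float,
                  gpu_vram_gb: float = 80.0,
                  gpu_price_usd: float = 30_000.0) -> tuple[int, float]:
    """GPUs required to fit the weights, and their combined price."""
    gpus = math.ceil(vram_needed_gb / gpu_vram_gb)
    return gpus, gpus * gpu_price_usd

gpus, cost = instance_cost(800)
print(gpus, cost)   # 10 GPUs, $300,000 under these assumed prices
```

GPUs alone; servers, networking, power, and staff come on top, which is the commenter's point about large-scale resale being unlikely.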

I don't see that happening on a large scale, just like I don't see tons of DeepSeek instances being hosted cheaper than the original any time soon.

If they really are afraid of that they can always license it in a way that forbids reselling.

this post was submitted on 01 Feb 2025
432 points (97.8% liked)