submitted 2 days ago by JOMusic@lemmy.ml to c/opensource@lemmy.ml

Article: https://proton.me/blog/deepseek

Calls it "Deepsneak", failing to make it clear that the reason people love DeepSeek is that you can download it and run it securely on any of your own private devices or servers - unlike most of the competing SOTA AIs.

I can't speak for Proton, but the last couple of weeks have shown some very clear biases coming out.

[-] simple@lemm.ee 16 points 2 days ago* (last edited 2 days ago)

I understand it well. It's still relevant to mention that you can run the distilled models on consumer hardware if you really care about privacy. 8GB+ of VRAM isn't crazy, especially with the unified memory on MacBooks or on some Windows laptops releasing this year with 64+ GB of unified memory. There are also sites re-hosting various versions of DeepSeek, like Hugging Face hosting the 32B model, which is good enough for most people.

Instead, the article is written as if there were literally no way to use DeepSeek privately, which is simply wrong.
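A quick back-of-envelope calculation shows why 8 GB of VRAM is roughly the floor for an 8B distilled model. This is a rough sketch with assumed numbers (4-bit quantization, ~20% overhead for KV cache and activations); actual requirements vary by runtime, quantization format, and context length:

```python
def vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 0.2) -> float:
    """Back-of-envelope VRAM estimate in GB for a quantized model.

    Weights: params * bits / 8 bits-per-byte; overhead is an assumed
    fudge factor for KV cache and activations, not a measured figure.
    """
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb * (1 + overhead)

# Hypothetical model sizes, matching the distilled variants mentioned above
print(f"8B  @ 4-bit: ~{vram_gb(8, 4):.1f} GB")   # fits (tightly) on an 8 GB GPU
print(f"32B @ 4-bit: ~{vram_gb(32, 4):.1f} GB")  # needs a big card or unified memory
```

By the same arithmetic, the 32B model at 4-bit lands around 19 GB, which is why it's comfortable on a 64 GB unified-memory laptop but not on a typical consumer GPU.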

[-] superglue@lemmy.dbzer0.com 2 points 2 days ago

So I've been interested in running one locally, but honestly I'm pretty confused about which model I should be using. I have a laptop with a mobile 3070 in it. What model should I be going after?

this post was submitted on 31 Jan 2025
370 points (94.7% liked)
