submitted 3 months ago by lidd1ejimmy@lemmy.ml to c/memes@lemmy.ml
[-] neidu2@feddit.nl 47 points 3 months ago* (last edited 3 months ago)

Technically possible with a small enough model to work from. It's going to be pretty shit, but "working".

Now, if we were to go further down in scale, I'm curious how/if a 700MB CD version would work.

Or how many 1.44MB floppies you would need for the actual program and smallest viable model.
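The floppy count is just ceiling division. A rough sketch, assuming a hypothetical ~5MB runner binary and a ~30MB ultra-small quantized model (both sizes are placeholder guesses, not real artifacts):

```python
# How many 1.44MB floppies to ship a tiny LLM?
# A "1.44MB" floppy actually holds 1440 KiB = 1,474,560 bytes.
FLOPPY_BYTES = 1_474_560

def floppies_needed(total_bytes: int) -> int:
    # Ceiling division: a partially filled disk still costs a whole floppy.
    return -(-total_bytes // FLOPPY_BYTES)

program = 5 * 1024 * 1024   # assumed size of a minimal llama.cpp-style runner
model = 30 * 1024 * 1024    # assumed size of the "smallest viable" model

print(floppies_needed(program + model))  # 25 disks for these guesses
```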

[-] Naz@sh.itjust.works 16 points 3 months ago

squints

That says, "PHILLIPS DVD+R"

So we're looking at a model of 4.7GB, or just a hair under: the tiniest, most incredibly optimized implementation of <INSERT_MODEL_NAME_HERE>

[-] curbstickle@lemmy.dbzer0.com 13 points 3 months ago

Llama 3 8B, Phi-3 Mini, Mistral, Moondream 2, Neural Chat, Starling, Code Llama, Llama 2 Uncensored, and LLaVA would fit.
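Whether a model squeezes onto a 4.7GB disc mostly comes down to parameter count times bits per weight. A back-of-the-envelope check (parameter counts are approximate, and this ignores the metadata overhead a real quantized model file carries):

```python
# Estimate a quantized model's file size as params * bits_per_weight / 8,
# then compare against single-layer DVD capacity. Rough napkin math only.
DVD_BYTES = 4_700_000_000  # 4.7GB, decimal gigabytes

def fits_on_dvd(params: float, bits_per_weight: float) -> bool:
    estimated_bytes = params * bits_per_weight / 8
    return estimated_bytes <= DVD_BYTES

print(fits_on_dvd(8e9, 4))    # 8B params at 4-bit: ~4.0GB -> True
print(fits_on_dvd(8e9, 16))   # same model at fp16: ~16GB -> False
print(fits_on_dvd(3.8e9, 4))  # ~3.8B-param mini model at 4-bit: ~1.9GB -> True
```

This is why the list above is all 4-bit-quantized small models: anything at full fp16 precision blows past the disc several times over.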

[-] BudgetBandit@sh.itjust.works 1 points 3 months ago

Just interested in the topic: did you 🔨 offline privately?

[-] curbstickle@lemmy.dbzer0.com 1 points 3 months ago

I'm not an expert on them or anything, but feel free.

this post was submitted on 18 Jul 2024
494 points (98.6% liked)

Memes

45450 readers

Rules:

  1. Be civil and nice.
  2. Try not to excessively repost; as a rule of thumb, wait at least 2 months before reposting if you have to.

founded 5 years ago