As the AI market continues to balloon, experts are warning that its VC-driven rise is eerily similar to that of the dot-com bubble.

[-] pexavc@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Oh wow, that's good to know. I always assumed visual graphics would be way more intensive. I wouldn't have thought a text generation model would take up that much VRAM.

Edit: how many parameters did you test with?

[-] ramblinguy@sh.itjust.works 1 points 1 year ago

Sorry, just seeing this now. I think with 24GB of VRAM, the most you can fit is a 4-bit quantized 30B model, and even then, I think you'd have to limit it to 2-3k of context. Here's a chart for size comparisons: https://postimg.cc/4mxcM3kX
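
Rough back-of-envelope for why that's the limit, sketched in Python (the layer count and hidden size are assumed LLaMA-30B-ish values, with an fp16 KV cache; real runtimes add overhead on top):

```python
# Back-of-envelope VRAM estimate for a 4-bit quantized LLM plus its KV cache.
# Layer count and hidden size below are assumed LLaMA-30B-ish values, not measured.

def estimate_vram_gb(params_billions, bits_per_weight, context_tokens,
                     n_layers, hidden_size, kv_bytes_per_value=2):
    # Quantized weights: parameters * (bits / 8) bytes each
    weights_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    # KV cache: a K and a V vector of hidden_size per layer, per token (fp16 = 2 bytes)
    kv_cache_gb = 2 * n_layers * hidden_size * context_tokens * kv_bytes_per_value / 1e9
    return weights_gb, kv_cache_gb

weights, kv = estimate_vram_gb(30, 4, 3000, n_layers=60, hidden_size=6656)
print(f"weights ~{weights:.1f} GB, KV cache at 3k context ~{kv:.1f} GB")
# -> roughly 15 GB of weights plus ~5 GB of cache, so a 24 GB card is already tight
```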

By comparison, with 24GB of VRAM I only use half of that to create a batch of 8 768x576 photos. I also subscribe to mage.space, and I'm pretty sure they're able to handle all of their volume on an A100 and an A10G.
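
For reference, a minimal diffusers sketch of that kind of batch (the checkpoint name and prompt are placeholders, and fp16 weights are assumed, not necessarily my exact setup):

```python
# Hedged sketch: batched image generation with Hugging Face diffusers.
# Checkpoint and prompt are placeholders for illustration only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder checkpoint
    torch_dtype=torch.float16,         # half precision keeps VRAM usage down
)
pipe = pipe.to("cuda")

# One call producing a batch of 8 images at 768x576
images = pipe(
    "a photo of a mountain lake at sunrise",
    num_images_per_prompt=8,
    height=576,
    width=768,
).images

for i, img in enumerate(images):
    img.save(f"out_{i}.png")
```

With fp16 weights and a batch of 8 at that resolution, peak usage lands around the half of 24GB I mentioned above.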
