501 points · submitted 10 months ago by throws_lemy@lemmy.nz to c/technology@lemmy.world
[-] BlastboomStrice@mander.xyz 58 points 10 months ago* (last edited 10 months ago)

[Edit: indeed, it's actually good that it's 2 GB]

2 GB plugin??!

Btw, does it work with Tenacity?

[-] 9point6@lemmy.world 64 points 10 months ago

AI models are often multiple gigabytes, so tbh it's a good sign that this isn't "AI" marketing bullshit (less of a risk with open source projects anyway). I'd be pretty wary of "AI" audio software that's only a few megabytes.

[-] interdimensionalmeme@lemmy.ml 10 points 10 months ago

TensorFlow Lite models are tiny, but they're potentially as much of an audio revolution as synthesizers were in the 70s. It's hard to tell if that's what we're looking at here.

[-] Neato@ttrpg.network 1 point 10 months ago

Why are they that big? Is it more than code? How could you get to gigabytes of code?

[-] General_Effort@lemmy.world 51 points 10 months ago

Currently, AI means Artificial Neural Network (ANN). That's only one specific approach. What an ANN boils down to is one huge system of equations.

The file stores the parameters of these equations, arranged in what math calls matrices. A parameter is simply a number by which something is multiplied. Colloquially, such a file of parameters is called an AI model.
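
To make "a number by which something is multiplied" concrete, here's a toy sketch (made-up numbers, not any real model) of the core operation inside an ANN: a small matrix of parameters multiplied against an input vector.

```python
# A "parameter" is just a number something gets multiplied by.
# Toy 2x2 parameter matrix applied to a 2-element input:
weights = [[0.5, -1.0],
           [2.0,  0.25]]
inputs = [3.0, 4.0]

# Each output is a weighted sum of the inputs (a matrix-vector product).
outputs = [sum(w * x for w, x in zip(row, inputs)) for row in weights]
print(outputs)  # [-2.5, 7.0]
```

A real model does this with matrices millions of entries wide, stacked in many layers, but the operation is the same.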

2 GB is probably an AI model with 1 billion parameters at 16-bit precision. Precision is how many digits you have: the more digits, the more precisely you can specify a value.
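
The arithmetic behind that estimate checks out:

```python
# Back-of-the-envelope: 1 billion parameters, each stored at 16-bit precision.
params = 1_000_000_000
bytes_per_param = 16 // 8          # 16 bits = 2 bytes
size_bytes = params * bytes_per_param
print(size_bytes / 10**9)          # 2.0 -> about 2 GB
```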

When people talk about training an AI, they mean finding the right parameters so that the equations compute the right thing. The bigger the model, the smarter it can be.
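
"Finding the right parameters" can be illustrated with a toy single-parameter model (this is plain gradient descent on made-up data, nothing from the actual plugin): we nudge the parameter until the equation fits the data.

```python
# Toy "training": find the parameter w so that w * x matches the data.
# The data here is generated with a true value of w = 3.
data = [(x, 3.0 * x) for x in range(1, 6)]

w = 0.0      # start with a wrong guess
lr = 0.01    # learning rate: how big each nudge is
for _ in range(200):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # nudge w downhill

print(round(w, 3))  # 3.0 -- training recovered the right parameter
```

A real model does the same thing with billions of parameters at once, which is why training is so expensive.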

Does that answer the question? It's probably missing a lot.

[-] Aatube@kbin.social 10 points 10 months ago* (last edited 10 months ago)

It's basically a huge graph/flowchart.

[-] Amir@lemmy.ml 7 points 10 months ago

They're composed of many big matrices, whose size scales quadratically with their dimension: a 32x32 matrix is 4x the size of a 16x16 matrix.
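
The quadratic scaling is just counting entries:

```python
# An n x n matrix holds n*n parameters, so doubling n quadruples the size.
small = 16 * 16      # 256 parameters
big = 32 * 32        # 1024 parameters
print(big // small)  # 4
```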

[-] 9point6@lemmy.world 6 points 10 months ago* (last edited 10 months ago)

The current wave of AI is built around Large Language Models, or LLMs. These are basically a metric fuckton of numbers produced by crunching a load of input data in different ways. Given that the input is things like text, pictures or audio distilled down into numbers, you can imagine we're talking about a lot of data.

(This is massively simplified, by someone who doesn't entirely understand it themselves)

[-] bamboo@lemm.ee 33 points 10 months ago

It seems reasonable given it includes multiple AI models.

[-] Fisch@lemmy.ml 7 points 10 months ago

2 GB is pretty normal for an AI model. I have some small LLM models on my PC and they're about 7-10 GB. The big ones take up even more space.
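
Those sizes line up with simple parameter-count arithmetic (illustrative only; real model files add some overhead, and actual sizes depend on the format): a 7-billion-parameter model at different precisions comes out in that range.

```python
# Rough file sizes for a 7-billion-parameter model at common precisions.
params = 7_000_000_000
for bits in (16, 8, 4):
    gb = params * bits / 8 / 10**9
    print(f"{bits}-bit: about {gb:.1f} GB")  # 14.0, 7.0, 3.5 GB
```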

this post was submitted on 13 Feb 2024
501 points (97.2% liked)
