[-] dualmindblade@hexbear.net 4 points 5 months ago

It really is. Another thing I find remarkable is that all the magic vectors (features) were produced automatically, without ever looking at the actual output of the model — only at activations in a middle layer of the network — and using a loss function that is purely geometric in nature: the training process has no idea what any of the features it is discovering actually mean.
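To make that concrete, here's a minimal NumPy sketch of the kind of sparse autoencoder setup being described (all dimensions, coefficients, and names here are made up for illustration). The key point the comment makes is visible in the code: the loss only sees the middle-layer activation vectors and the reconstruction of them, never the model's output or the meaning of any feature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: width of the middle layer, and an overcomplete
# "dictionary" of candidate features (more features than dimensions).
d_model, d_features = 64, 256

W_enc = rng.normal(0, 0.1, (d_model, d_features))   # encoder weights
W_dec = rng.normal(0, 0.1, (d_features, d_model))   # decoder weights
b_enc = np.zeros(d_features)

def sae_loss(acts, l1_coeff=1e-3):
    """acts: a batch of middle-layer activations, shape (batch, d_model)."""
    f = np.maximum(acts @ W_enc + b_enc, 0.0)        # ReLU feature activations
    recon = f @ W_dec                                # try to rebuild the activations
    mse = np.mean((recon - acts) ** 2)               # reconstruction error
    # L1 penalty: pushes most feature activations to exactly zero (sparsity).
    sparsity = l1_coeff * np.mean(np.abs(f).sum(axis=1))
    return mse + sparsity

# Stand-in for real activations recorded from a forward pass.
batch = rng.normal(size=(32, d_model))
loss = sae_loss(batch)
```

Nothing in `sae_loss` references tokens, logits, or labels — it's just "reconstruct these vectors using as few active features as possible," which is why it's fair to call the objective geometric.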

And the fact that this works seems to confirm, or at least nearly confirm, a non-trivial fact about how transformers do what they do. I always like to point out that we know more about the workings of the human brain than we do about the neural networks we have ourselves created. That's probably still true, but this makes me optimistic we'll at least clear that very low bar in the near future.

this post was submitted on 23 May 2024