I wish I was as bold as these authors.
What makes the "spicy autocomplete" perspective incomplete is also what makes LLMs work. The "Attention is All You Need" paper that introduced attention transformers describes a type of self-awareness necessary to predict the next word. In the process of writing the next word of an essay, it navigates a 22,000-dimensional semantic space, And the similarity to the way humans experience language is more than philosophical - the advancements in LLMs have sparked a bunch of new research in neurology.