What makes the “spicy autocomplete” perspective incomplete is also what makes LLMs work. The “Attention Is All You Need” paper that introduced the transformer architecture describes a self-attention mechanism that amounts to a kind of self-awareness in predicting the next word: to write the next word of an essay, the model navigates a 22,000-dimensional semantic space, attending to everything it has already said. And the similarity to the way humans experience language is more than philosophical; advances in LLMs have sparked a wave of new research in neuroscience.
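
For a concrete sense of what that attention step actually computes, here's a rough numpy sketch of the paper's scaled dot-product self-attention (toy dimensions and made-up weight names, nothing production-sized): each token scores its relevance to every other token in the context, and what feeds the next-word prediction is a weighted blend of all of them.

```python
# Minimal sketch of scaled dot-product self-attention ("Attention Is All You Need").
# Toy sizes only -- real models use embedding spaces with thousands of dimensions.
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """x: (seq_len, d_model) token embeddings; Wq/Wk/Wv: learned projections."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    d_k = K.shape[-1]
    # Every position scores its relevance to every other position in the sequence...
    scores = Q @ K.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the sequence
    # ...and each position's output is a weighted mix of all positions' values.
    return weights @ V

rng = np.random.default_rng(0)
d_model, d_k, seq_len = 8, 4, 5            # toy dimensions
x = rng.normal(size=(seq_len, d_model))    # stand-in for token embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)  # (5, 4)
```

Stack dozens of these layers and the "autocomplete" is choosing each word in light of the entire context at once, which is the part the dismissive framing leaves out.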
If our present reality were described accurately in a novel written in 1985, it would be firmly cyberpunk.