Comment by chaps
6 months ago
I highly recommend playing with embeddings in order to get a stronger intuitive sense of this. It really starts to click that these are representations in a high-dimensional space when you can actually see the embeddings' positions within that space.
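To make that concrete, here's a minimal sketch of the kind of exploration being suggested: load a pretrained embedding model, project a handful of words down to 2D, and look at where they sit relative to each other. The model name and word list are illustrative choices, not anything from this thread.

```python
# A minimal sketch of "playing with embeddings": load pretrained word vectors,
# project a few words to 2D, and inspect their relative positions.
import gensim.downloader as api
from sklearn.decomposition import PCA

# Downloads the vectors on first run; any pretrained vector set works similarly.
vectors = api.load("glove-wiki-gigaword-50")

words = ["king", "queen", "man", "woman", "paris", "france", "tokyo", "japan"]
points = PCA(n_components=2).fit_transform([vectors[w] for w in words])

for word, (x, y) in zip(words, points):
    print(f"{word:>8}: ({x:+.2f}, {y:+.2f})")

# Nearest neighbours in the full 50-dimensional space are often more telling
# than any 2D projection:
print(vectors.most_similar("king", topn=5))
```

Related word pairs tend to land near each other, which is the kind of geometric intuition the comment is pointing at.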
> of this
You mean that LLMs are more than just the matmuls they're made up of, or that that is exactly what they are and how great that is?
Not making a qualitative assessment of any of it. Just pointing out that there are ways to build separate sets of intuition outside of using the "usual" presentation layer. It's very possible to take a red-team approach to these systems, friend.
They don't want to. It seems a lot of people are uncomfortable and defensive about anything that may demystify LLMs.
It's been a wake-up call for me to see how many people in the tech space have such strong emotional reactions to any notion of trying to bring discourse about LLMs down from the clouds.
The campaigns by the big AI labs have been quite successful.
Yes, and what I was trying to do was learn a bit more about that alternative intuition of yours. Because it doesn't sound all that different from what's described in the OP, or from what anyone can trivially glean from a 101 course on AI at university or similar.