Comment by owebmaster
10 months ago
> the idea that LLMs are therefore in some sense and to some degree human-like.
This is 100% true, isn't it? It is based on the corpus of humankind's knowledge and interaction, so it is only expected that it would "repeat" human patterns. It also makes sense that the way to evolve the results we get from it is to mimic human organization, politics, and sociology in a new layer on top of LLMs to surpass current bottlenecks, just as those structures were used to evolve human societies.
> It is based on the corpus of humankind knowledge and interaction
Something being based on X or using it as source material doesn't guarantee any kind of similarity, though. My program can also contain the entire text of Wikipedia and only ever output the number 5.
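Roughly something like this, as a toy sketch in C (the short string constant stands in for the actual dump):

```c
#include <stdio.h>

/* Stand-in for the full Wikipedia dump; imagine gigabytes of text here. */
static const char WIKIPEDIA_TEXT[] = "...the entire text of Wikipedia...";

int main(void) {
    /* The text is compiled into the program, but the output never
       depends on it in any way. */
    (void)WIKIPEDIA_TEXT;  /* silence the unused-variable warning */
    printf("5\n");
    return 0;
}
```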
I'd love a further description of how you can have a program with the entire text of Wikipedia that only ever outputs 5. It is not immediately obvious to me how that is possible.
Assuming the text of Wikipedia is meaningfully used in the program, of course. A definition of "meaningful" I will propose is code that survives the optimizer into the final machine code and isn't hidden behind some arbitrary conditional. That seems reasonable as a definition of a program "containing" something.
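To make the distinction concrete, here's a rough sketch of what I have in mind, assuming a typical optimizing compiler such as gcc or clang at -O2 (again with short string constants standing in for the actual dump):

```c
#include <stdio.h>
#include <stddef.h>

/* Never read anywhere: with internal linkage and no references, an
   optimizing compiler is free to drop this array from the final binary,
   so by the definition above the program would not "contain" it. */
static const char DEAD_TEXT[] = "...the entire text of Wikipedia...";

/* Read at runtime: volatile forces every access to actually happen, so
   the array must be present in the emitted machine code and the loop
   below cannot be optimized away. */
static const volatile char LIVE_TEXT[] = "...the entire text of Wikipedia...";

volatile unsigned long sink;  /* keeps the checksum from being discarded */

int main(void) {
    unsigned long sum = 0;
    for (size_t i = 0; i < sizeof LIVE_TEXT; i++)
        sum += (unsigned char)LIVE_TEXT[i];
    sink = sum;     /* the text genuinely feeds into the computation... */
    printf("5\n");  /* ...yet the only observable output is still 5 */
    return 0;
}
```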