Comment by A4ET8a8uTh0_v2
1 day ago
Eh. Overnight, an entire field concerned with what LLMs could do emerged. The consensus appears to be that the unwashed masses should not have access to unfiltered (and thus unsafe) information. Some of it is based in reality, as there are always people who are easily suggestible.
Unfortunately, the ridiculousness spirals to the point where real information cannot be trusted even in an academic paper. *shrug* In a sense, we are going backwards in terms of real information availability.
Personal note: I think the powers that be do not want to repeat the mistake they made with the interbwz.
Also note, if you never give the info, it’s pretty hard to falsify your paper.
LLMs are also allowing an exponential increase in the ability to bullshit people in hard-to-refute ways.
But, and this is an important but, it suggests a problem with people... not with LLMs.
Which part? That people are susceptible to bullshit is a problem with people?
Everything is susceptible to bullshit to some degree!
For some reason people keep insisting LLMs are ‘special’ here, when really it’s the same garbage-in, garbage-out problem - magnified.
> I think the powers that be do not want to repeat -the mistake- they made with the interbwz.
But was it, really?