You put people in nice little drawers: the skeptics and the non-skeptics. It is reductive and, most of all, polarizing. This is what US politics has become, and we should avoid that here.
Yeah, putting labels on people is not very nice.
[flagged]
A 10-month-old account talking like that to the village elder.
In fairness, the article is a lot more condescending and insulting to its readers than the comment you're replying to.
An LLM is essentially the world's information packed into a very compact format. It is the modern equivalent of the Library of Alexandria.
Claiming that your own knowledge is better than the compressed consensus of all the world's books is very optimistic.
If you are not sure about a result given by an LLM, it is your task as a human to cross-verify the information, the same way that information in books is not 100% accurate and Google results are not always telling the truth.
As someone who has followed Thomas' writing on HN for a long time... this is the funniest thing I've ever read here! You clearly have no idea about him at all.
Especially coming from you, I appreciate that impulse, but I once had the experience of running across someone else whom the Internet (or Bsky, at least) believed I had no business not knowing about, and I did not enjoy it, so I'm now an activist for the cause of "people don't need to know who I am". I should have written more clearly above.
One would hope the experience leads to the position, and not vice-versa.
... you think tptacek has no expertise in cryptography?
That is no different from pretty much any other person in the world. If I interview people in order to catch them making mistakes, I will be able to do exactly that. Sure, there are some exceptions, like if you were to interview Linus about Linux, but other than that, you'll always be able to find a gap in someone's knowledge.
None of this makes me 'snap out' of anything. Accepting that LLMs aren't perfect means you can just keep that in mind. For me, they're still a knowledge multiplier, and they allow me to be more productive in many areas of life.
Not at all. Useful or not, LLMs will almost never say "I don't know". They'll happily call a function from a library that never existed. They'll tell you "Incredible idea! You're on the correct path! And you can easily do that with such-and-such software", and you'll be like "wait, what? That software doesn't do that", and they'll answer "Ah, yeah, you're right, of course."