Comment by lkm0

7 days ago

This sounds misguided. In my limited experience, models get basic knowledge so thoroughly wrong that giving them any real independence will not produce publications that improve a professor's reputation or contribute to science. At least, the reviews and papers I have read that contained AI content did not leave me with the impression that we should have more of this. They also require much more supervision, with the added problems that they cannot learn over the long term from your interactions, and that you lose the enjoyment of teaching something to someone. They're really good at finding papers, though — perhaps because navigating search engines has become such a pain. Maybe this will be the case in the future, but saying you're tempted right now is like saying you're tempted to replace your HPC cluster with quantum computers. It's a bit early.

Upon reading this:

> The issue is not whether my students are valuable. In the long run, they are invaluable. The issue is that their value emerges slowly, whereas AI delivers immediate returns.

I had the thought that it's more like hiring only autistic/on-the-spectrum employees who will, on a whim, do exactly what their interpretation of your request was, or possibly worse, do literally what you said without considering the further consequences.

  • Sounds a bit like externalising the learning cost (onto the AI models) is preferred over investing the time in training the students.

    • You think? I would get banned from HN if I brought up that these models are fundamentally built on theft, but we just don't put anyone in jail because they had the foresight to bribe the Trump admin, like everyone else seeking favor did.

Also, 90% of citations generated by AI are wrong or flat-out don't exist. It has a long way to go before it can reliably write credible papers.

[Source: https://www.reddit.com/r/AskReddit/comments/o6hlry/statistic... ]

  • Your source is a five-year-old AskReddit post that itself claims it is made up.

    While funny, it does nothing to prove your assertion.

    • >While funny, it does nothing to prove your assertion.

      Unless that citation was generated by AI.

    • I think you missed the point. Yes, it was meant to be humorous, and also to emphasise one of the reasons AI-generated citations are completely untrustworthy, especially with the growing number of AI-generated (junk) papers being published.

      No, I had no intention of offering a real source for the accuracy of AI-generated citations. It is not hard to Google, search HN, or even (ironically) use AI to search, and find numerous relatively recent studies discussing the problem or highlighting specific cases of respected journals/conferences publishing papers with junk citations.