Comment by BobbyJo

16 hours ago

> Nobody I know in real life, personally or at work, has expressed this belief.

TBF, most people in real life don't know how AI works to any meaningful degree, so using their opinions as evidence that the parent's view is extreme is kind of circular reasoning.

> I have literally only ever encountered this anti-AI extremism (extremism in the non-pejorative sense) in places like reddit and here.

I don't see the parent's opinions as anti-AI. It's more an argument about what AI currently is and what research is supposed to be: AI recombines existing ideas, while research is supposed to produce new ones. If much of your research paper can be written by AI, I call into question whether it represents actual research.

> Research is supposed to be new ideas. If much of your research paper can be written by AI, I call into question whether or not it represents actual research.

One would hope the authors are forming a hypothesis, performing an experiment, gathering and analysing results, and only then passing it to the AI to convert it into a paper.

If I have a theory that, IDK, laser welds in a sine wave pattern are stronger than laser welds in a zigzag pattern - I've still got to design the exact experimental details, obtain all the equipment and consumables, cut a few dozen test coupons, weld them, strength test them, and record all the measurements.
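The "gathering and analysing results" step above can be sketched in code. This is a minimal illustration only: the weld pattern names come from the analogy, but every measurement below is an invented placeholder, purely to show the shape of the analysis the human researcher would do before any AI touches the write-up.

```python
import statistics

# Hypothetical tensile-strength measurements (MPa) for the weld example;
# the numbers are made up for illustration, not real experimental data.
sine_welds = [412, 398, 421, 405, 417, 409]
zigzag_welds = [388, 395, 379, 401, 384, 390]

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances)."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

t = welch_t(sine_welds, zigzag_welds)
print(f"mean sine = {statistics.mean(sine_welds):.1f} MPa, "
      f"mean zigzag = {statistics.mean(zigzag_welds):.1f} MPa, t = {t:.2f}")
```

The point is that this step, and the coupons behind it, are the research; turning the resulting table into paragraphs is the part that can defensibly be delegated.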

Obviously if I skipped the experimentation and just had an AI fabricate the results table, that's academic misconduct of the clearest form.

> TBF, most people in real life don't even know how AI works to any degree

How about the authors who do research for NeurIPS? Do they know how AI works?

  • Who knows? Does NeurIPS have a pedigree of original, well-sourced research dating back to before the advent of LLMs? We're at the point where the terms "AI" and "experts" are both so blurred that it's almost impossible to trust or distrust anything without spending more time on due diligence than most subjects deserve.

    As the wise woman once said, "Ain't nobody got time for that".

> If much of your research paper can be written by AI, I call into question whether or not it represents actual research

And what happens to this statement if, later this year or next, autonomously written papers pass the median human paper mark?