Comment by greygoo222

3 hours ago

Have you ever tried writing a long, complicated document with an LLM? The last 20% takes 99% of the work.

True, and even more so when you barely understand what you're doing. That's a feature rather than a bug of this sort of paperwork; the person who has simply pestered ChatGPT until it says "great idea" won't cross that threshold at all, whereas this guy [and the bioinformatics processing chain and experts in the loop he found] crossed it in his spare time. If it were just the "two hours a night typing" quoted in the article, LLMs could do it in no time.

"ChatGPT is better at finding expert advice than at filling in compliance forms" — or even "getting workable results from latest-generation open-source bioinformatics tools is possible for smart laymen with minimal background reading; learning enough to prove they aren't dangerous only takes marginally longer" — doesn't sound nearly as bad as "layman asks ChatGPT to cure his dog's cancer; the only hard part is writing enough words to convince gatekeepers", as rendered by the news coverage (and not really elaborated on by TFA). A rendering which really should trigger people's bullshit filters.

Other fields crossed the "computers can find potential solutions easily" threshold a lot earlier (any idiot can put dimensions into pretty dumb civil engineering tools and get answers that are probably correct; don't ask me how I know!) and actually have higher barriers (no, even if you actually learn the relevant physics, you will still need to pass some elements of your home design via someone with the right professional liability insurance linked to their experience and formal qualifications).

> The last 20% takes 99% of the work.

Of course it does, since the first 80% takes literal minutes! But compared to doing it entirely manually, it's still 5x more efficient.

Why would you do it all by hand (spending 200 hours in the process…) when you're an "AI entrepreneur"?

In fairness, it pains me to see people as gullible as you are just because you like the idea of the story being true.