Comment by akomtu

2 years ago

GPT-4 does an OK job translating texts that aren't complex, but if you read the original and its translation side by side, you'll see that it still makes dumb mistakes every few sentences, hallucinates when it runs into cryptic words it's not familiar with, and sometimes omits important passages. GPT-4 is like a very productive but clueless newbie.

> gpt4 still makes dumb mistakes every few sentences, hallucinates stuff when it runs into cryptic words it's not familiar with, and sometimes omits important passages

This holds in every domain where you are an expert: it becomes apparent that GPT-4 makes stupid mistakes, and makes them frequently. I can't seriously see it replacing humans soon; it is far from having four 9's of reliability.

It probably needs 99.99% accuracy to work alone, unsupervised by humans, because it incurs the error rate again at each decision step, so the chance of a fully correct run decays exponentially with sequence length. Coupled with the "forward only" thinking pattern of LLMs, which doesn't allow backtracking or planning, this error rate kills autonomy.
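A rough back-of-envelope sketch of the compounding argument, assuming (for simplicity) that each step succeeds independently with the same probability p:

```python
# If each autonomous decision step succeeds independently with probability p,
# the chance that an n-step chain is fully correct is p**n, which decays
# exponentially in n. The numbers below illustrate why per-step accuracy
# far short of four 9's collapses quickly over long chains.
def chain_success(p: float, n: int) -> float:
    return p ** n

for p in (0.99, 0.999, 0.9999):
    print(f"p={p}: 100 steps -> {chain_success(p, 100):.3f}, "
          f"10000 steps -> {chain_success(p, 10000):.3f}")
```

Even at 99.99% per-step accuracy, a 10,000-step chain completes correctly only about a third of the time; at 99%, it is effectively never.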

Recently, GPT-4-Vision was found to have poor OCR accuracy. Smart but stupid: same story.

And a demon summoning is one of the worst imaginable places to have dumb mistakes in the instructions and incantations.