Comment by macrolime
2 years ago
Another thing I have noticed is that if you use ChatGPT and it at some point uses Bing to look something up, it becomes super lazy afterwards, going from page-long responses on average to a single paragraph.
So the more advanced the AI, the more human-like it becomes. Senior Programmer level AI will spend all computing resources browsing memes.
It probably has to do with the extended context window. Keeping websites in there is kind of a hassle. But I actually consider that a feature, not a bug. If I have ChatGPT use the internet, I don't want a full page answer - especially not on the relatively slow GPT4. It's also a hassle if you're unsure about the validity of the output. In that case I might as well browse myself. Just give me a short preview so I can either start searching on my own or ask more questions.
You can/should make a custom GPT that isn't allowed to use Bing. Works much better that way
Use ChatGPT Classic.
If answer is too lazy, you can tell it to elaborate. However, repairing a lazy context is sometimes slow and unreliable.
To avoid that, use backtracking and up the pressure for detailed answers. Then consider taking the least lazy of 2 or 3 samples.
A good prompt for detailed answers is Critique of Thought, an enhanced chain-of-thought technique. You ask for a search and a detailed response with simple sections including analysis, critique, and key assumptions.
It will expend more tokens, get more ideas out, and achieve higher accuracy. It will also be less lazy and more liable to recover from laziness or mistakes.
TL;DR: if GPT-4 is being lazy, backtrack and request a detailed multi-section critical analysis.
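To make the backtrack-and-resample idea above concrete, here is a minimal sketch: a Critique-of-Thought style prompt wrapper plus a picker for the least lazy of 2–3 samples. The section names and the "length as a laziness proxy" heuristic are my own assumptions, not a fixed recipe.

```python
# Sketch of the technique described above. The section headings and the
# length-based "laziness" heuristic are illustrative assumptions.

def critique_of_thought_prompt(question: str) -> str:
    """Wrap a question so the model answers in structured sections,
    which tends to spend more tokens and surface more reasoning."""
    return (
        f"{question}\n\n"
        "Answer in detail with these sections:\n"
        "1. Analysis - work through the problem step by step.\n"
        "2. Critique - challenge your own reasoning above.\n"
        "3. Key assumptions - list what your answer depends on.\n"
    )

def least_lazy(samples: list[str]) -> str:
    """Pick the most detailed of several resampled answers.
    Response length is a crude but cheap proxy for detail."""
    return max(samples, key=len)

# Usage: backtrack (re-send the same prompt 2-3 times) and keep the best.
prompt = critique_of_thought_prompt("Why does my async task deadlock?")
samples = ["Short reply.", "A much longer, multi-section reply ..."]
best = least_lazy(samples)
```

In practice you would feed `prompt` to the model 2–3 times from the same (non-lazy) point in the conversation and keep the sample `least_lazy` returns, rather than trying to repair an already-lazy context.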