
Comment by Sai_Praneeth

3 months ago

I don't know if this happened only to me or to others as well, but apart from the glazing, the model also became a lot more confident. It stopped using the web search tool when asked about things outside its training data, and it straight up hallucinated multiple times.

I'd been talking to ChatGPT about RL, and GRPO in particular, across 10-12 chats. I opened a new chat, and suddenly it started hallucinating: it claimed GRPO stands for "generalized relativistic policy optimization," even though all my previous chats were about group relative policy optimization.

I reran the same prompt with web search enabled, and it then said "goods receipt purchase order."

An absolute close-the-laptop-and-throw-it-out-the-window moment.

What is the point of having "memory"?