Comment by VectorLock

6 days ago

Because I hope that someone whose hands were required to assemble the recipe didn't blindly add ingredients like "bleach" if the AI happened to hallucinate them.

A naive hope perhaps, but this ignores the risk of LLMs simply producing a bad recipe by blindly combining various recipes from their training data.

  • As the parent comment said, the people seemed to be enjoying the food, so the LLM didn't create an unpalatable combination, and I can't think of any combination of edible, harmless ingredients that would combine into something harmful (when consumed in a reasonable amount).

    • This is exactly what makes it dangerous. Food can taste fine but still make you sick; not all bacteria are going to taste off. I'm assuming you're not a chef, because if you were, you'd know how absurd your statement is.

      For a super simple example: if you don't properly handle or cook raw meat, you risk getting sick even though the food might not immediately taste bad. Maybe that's obvious to you, but it might not be to the person preparing the food. Another example: rhubarb pie is supposed to be made with the leaves and not the stalk, because the stalk is poisonous and can cause illness. Just kidding, it's actually the other way around, but if you were just reading a ChatGPT recipe that made that mistake, maybe you wouldn't have caught it.

    • If meat was involved, the cooking time may have been unsafe if other precautions weren't taken by the cook (like checking the internal temperature).

Your personal hope aside, why is it irrational for them?

  • Because the implication is that the generated recipe carries any more risk than a random human-generated recipe from wherever. People who would trust a 'bleach recipe' from AI would also trust it from a TikTok video or whatever.

    Edit: is it irrational to think this way when someone prepares your food?

    • But that's just a made-up implication to make the other one look better. It's not the only possible alternative, so it doesn't explain the irrationality.