Comment by sesm

5 hours ago

Sure, LLMs are good at generating text that humans can interpret as educated guesses. But a list of educated guesses is not 'enumerating options': an informed decision requires a complete list of options, so that nothing is missed. Imagine using a Monte Carlo method with a sample size of 3 to find a function's extremum - that's the equivalent of using an LLM-generated list of options to make a decision.
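The analogy can be made concrete with a small sketch (hypothetical example, not from the original comment): a Monte Carlo search for the maximum of a function with a narrow peak. With 3 samples the peak is almost always missed entirely; only a much larger sample size finds it.

```python
import random

def monte_carlo_max(f, lo, hi, n, seed=0):
    """Estimate the maximum of f on [lo, hi] from n uniform random samples."""
    rng = random.Random(seed)
    return max(f(rng.uniform(lo, hi)) for _ in range(n))

# A function with a narrow spike: the true maximum is f(0.5) = 1.0,
# but f is zero everywhere outside a window of width 0.04 around 0.5.
f = lambda x: max(0.0, 1.0 - 50.0 * abs(x - 0.5))

tiny  = monte_carlo_max(f, 0.0, 1.0, n=3)        # three "educated guesses"
large = monte_carlo_max(f, 0.0, 1.0, n=100_000)  # an actual enumeration-like sweep
```

With only 3 samples the estimate stays at 0.0 (the spike is never hit), while 100,000 samples recover a value close to the true maximum of 1.0 - which is the gap between a handful of plausible-looking options and a genuinely complete search.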