Comment by OccamsMirror
14 days ago
Well said. It's wild when you think of how many "AI" products out there essentially entrust an LLM to make the decisions the user would otherwise make: recruitment and hiring, trading, content creation, investment advice, medical diagnosis, legal review, dating matches, financial planning.
At some point you have to wonder: is an LLM making your hiring decision really better than rolling a die? At least the die doesn't give you the illusion of rationality; it doesn't generate a neat-sounding paragraph "explaining" why candidate A is the obvious choice. The LLM produces text that looks like reasoning but has no actual causal connection to the decision — it mimics explanation without the substance of causation.
You can argue that humans do the same thing. But for humans, post-hoc reasoning often feeds back into the eventual answer; that's not the case for LLMs.
> it doesn't generate a neat sounding paragraph "explaining" why candidate A is the obvious choice.
Here I will argue that humans do the same thing. For a business of any size, recruitment has been pretty awful in recent history. The end user — that is, the manager the employee will be hired under — is typically a late step after a lot of other filters, some automated, some not.
At the end of the day, the only way to settle it is to measure the results. Do LLMs produce better hiring outcomes than the existing process?
Also, LLMs seem very good at medical pre-diagnosis. If you accurately describe your symptoms to them, they come back with a decent list of possible diagnoses. In barbaric nations like the US, where medical care can easily lead to bankruptcy, people are going to use them as a filter to decide whether a visit is worth it.