Comment by gamerson
8 hours ago
From the article...
> Claude has a glaring limitation: it only does 1:1 conversations. In business, work happens in groups. Today, if I want Claude's help with something that came up in a Slack thread, I have to relay the context between Slack and Claude by copy-pasting. This is absurd. I am not a sub-agent!
It seems to me that LLMs/chatbots are engineered for one thing above ground-level truth: attention. The more people you bring into a shared context, the harder it becomes to retain everyone's attention.
Here is my anecdotal evidence for this: when I chat with a chatbot, I find its answers and line of thinking relevant, compelling, and worth engaging with. However, when people share their chatbot links with me and I read their conversations, I have yet to find one that is compelling or worth engaging with. Maybe the newer models are good enough to retain the attention of a large group, but I don't see it happening.