Comment by coredog64
3 hours ago
A customer service chatbot can require more than one LLM call per response, to the point that latency anywhere in the system starts to show up as a degraded end-user experience.
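A minimal sketch of the point being made: when a single chatbot turn chains several LLM calls sequentially, their latencies add up, so any slowdown in any call lands directly on the user. The stage names and millisecond figures below are purely hypothetical, not from the comment.

```python
# Hypothetical per-call latencies (ms) for one chatbot response turn.
# All stage names and numbers are illustrative assumptions.
pipeline_ms = {
    "intent_classification": 300,
    "retrieval_and_rerank": 450,
    "answer_generation": 1200,
    "safety_check": 250,
}

# Sequential calls: total user-visible LLM latency is the sum of the stages.
total_ms = sum(pipeline_ms.values())
print(f"end-to-end LLM latency: {total_ms} ms")  # prints "end-to-end LLM latency: 2200 ms"
```

Even before network and queueing overhead, four modest calls already put the turn over two seconds, which is why per-call latency regressions become visible to end users.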