Comment by barbazoo
2 days ago
This sounds like a way to have the LLM client render dynamic UI. Is this for use during the chat session or yet another way to build actual applications?
Google PM here. Right now it's designed for rendering UI widgets inline with a chat conversation. It's an extension to a2a that lets you stream JSON defining UI components in addition to chat messages.
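(For illustration only, here is a minimal sketch of what interleaving UI payloads with chat messages on one stream might look like. The field names and component vocabulary below are hypothetical, not the actual a2a UI extension schema.)

```typescript
// Hypothetical shape of a streamed message in an a2a-style UI extension.
// Field names and the component vocabulary are illustrative, not the real spec.
type ChatMessage = { kind: "text"; text: string };

type UiComponent =
  | { type: "card"; title: string; body: string }
  | { type: "button"; label: string; action: string };

type UiMessage = { kind: "ui"; components: UiComponent[] };

// The agent interleaves ordinary chat text with JSON-defined UI widgets
// on the same stream, and the client renders the widgets inline.
const stream: Array<ChatMessage | UiMessage> = [
  { kind: "text", text: "Here are two flights that match your dates:" },
  {
    kind: "ui",
    components: [
      { type: "card", title: "SFO -> JFK", body: "Departs 8:05 AM, $312" },
      { type: "button", label: "Book this flight", action: "book_flight_1" },
    ],
  },
];
```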
Google SWE working in this space here. Look up my username (minus the digit) on Moma, let's talk. I can't ID you from your HN handle.