Comment by aprilnya
8 hours ago
I’ve heard stories of people using the Meta smart glasses to help with reduced vision, e.g. asking the LLM assistant what you’re looking at, asking it to read a label, etc. The LLM assistant can see the camera feed, so it is capable of doing that.
However, things like the urgent warnings you mentioned don’t exist yet.
Hearing about the way people with bad vision use these glasses kind of changed my viewpoint on them, to be honest; for the average person it might seem useless to be able to ask an LLM about what you’re looking at, but from an accessibility standpoint it seems like a really good idea.