Comment by 9dev

9 hours ago

Does anyone work on smart glasses for blind people yet? Something with blackened glass, obviously, that uses image recognition to translate visual input into text via (headphone) audio to the wearer.

That would allow for urgent warnings (approaching a street, walking towards obstacle [say, an electric scooter or a fence]), scene descriptions on request, or help finding things in the view field. There's probably a lot more you could do with this to help improve quality of life for fully blind people.

I’ve heard stories of people using the Meta smart glasses to cope with reduced vision, e.g. asking the LLM assistant what they’re looking at, asking it to read a label, and so on. The assistant can see the camera feed, so it is capable of doing that.

However, things like the urgent warnings you mentioned don’t exist yet.

Hearing about how people with low vision use these glasses honestly changed my view of them: to the average person, asking an LLM what you’re looking at might seem useless, but from an accessibility standpoint it’s a really good idea.

If the top-level poster succeeds, the resulting device could also disable assistive devices that let blind people perceive their surroundings, which would open up yet another liability channel.

Every time I read about smart glasses I wonder the same thing. The technology obviously isn’t perfect, but even a basic pair of smart glasses with primitive image processing could be life-changing for a completely blind person. Yet as far as I can tell, most blind people don’t use any technology for this purpose.

Unfortunately, the HN website is extremely unfriendly to users relying on assistive technologies (lack of ARIA attributes, semantic HTML elements, etc.); otherwise there might be more blind people commenting here who could shed light on such things, no pun intended.

  • Makes me wonder just how big the market for such a device would be, and if it would attract investors…