Comment by rldjbpin

2 months ago

If you look beyond local LLMs (which are also served through dedicated apps), the title holds a lot of promise. Case in point: WASM and WebGPU.

Edge/on-device AI use cases on smartphones could also extend, without user friction, to web apps built on those standards. Perhaps one day there will be a "WebNPU", or NPUs will simply be supported through the existing standards.

There are already some use cases in web apps, but they usually fall back to the CPU. Perhaps this could be the hardware-accelerated moment we saw with video on the web.
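
As a rough illustration of that fallback pattern, here is a minimal sketch of how a web app might prefer WebGPU and otherwise drop to a WASM/CPU path. Only the `navigator.gpu` check is the standard WebGPU API; the backend labels are hypothetical placeholders, and it assumes WebGPU type definitions (e.g. @webgpu/types) are available:

```ts
// Minimal sketch: prefer WebGPU when available, otherwise fall back to a
// WASM/CPU path (the common situation described above). The backend names
// are illustrative, not a real library's API.
async function pickBackend(): Promise<"webgpu" | "wasm-cpu"> {
  if (navigator.gpu) {
    const adapter = await navigator.gpu.requestAdapter();
    if (adapter) return "webgpu"; // hardware-accelerated path
  }
  return "wasm-cpu"; // CPU fallback most web apps hit today
}

pickBackend().then((backend) => console.log(`Inference backend: ${backend}`));
```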