Comment by kushalpatil07
1 day ago
I worked on on-device AI for three years, and this was the prime idea we were exploring: how can someone undercut the OS providers and ship an LLM that other apps can also use on-device? If Meta, say, decided to do this, it could serve an on-device LLM API to all mobile app companies long before the OS offers one. This is Google's way of getting LLM distribution on laptops, since they don't have their own.