Comment by refulgentis
21 hours ago
My post politely describing how this blog post does not match Google's own app running inference on Pixel was downvoted to -1, below dead posts with one-off short jokes.
I am posting again because I've been here 16 years, it is very suspicious that this happened, and given the replies to it, we now know this blog post is false.
There is no open model that you can download today and run at even 1% of the speeds claimed in the blog post.
You can read a reply from someone indicating they have inside knowledge of this, who notes it won't work as advertised unless you're Google: internally, they bind it to a privileged system process that can access the Tensor core, and that access isn't available to third parties. Anyone else gets 1/100th of the speeds in the post.
This post promises $150K in prizes for on-device multimodal apps and tells you it runs at up to 60 fps, when they know it runs at 0.1 fps for third parties. Engineering says that is because they haven't prioritized third parties yet, and somehow Google is getting away with this.