Comment by catchmrbharath

1 day ago

The APK that you linked runs the inference on CPU, not on Google Tensor.

That sounds fair, but it opens up several more questions:

- Are there APK(s) that run on Tensor?

- Is it possible to run on Tensor if you're not Google?

- Is there anything at all from anyone I can download that'll run it on Tensor?

- If there isn't, why not? (i.e. this isn't the first on device model release by any stretch, so I can't give benefit of the doubt at this point)

  • > Are there APK(s) that run on Tensor?

    No. The AiCore service runs the inference on Tensor internally (http://go/android-dev/ai/gemini-nano)

    > Is there anything at all from anyone I can download that'll run it on Tensor?

    No.

    > If there isn't, why not? (i.e. this isn't the first on device model release by any stretch, so I can't give benefit of the doubt at this point)

    Mostly because 3P support has not been an engineering priority.

    • > Mostly because 3P support has not been an engineering priority.

      Got it: assuming you're at Google, in eng. parlance it's okay if it's not Prioritized™, but then product/marketing/whoever shouldn't be publishing posts built on the premise that it's running 60 fps multimodal experiences on device.

      They're very, very lucky that the ratio of people vaguely interested in this to people who follow through on using it is high, so comments like mine end up at -1.