Comment by marcosdumay

14 hours ago

Shipping the model with the browser is exactly the opposite of what you are claiming.

The alternative is sending the data to Google.

Back to the assumptions.

If the onboard LLM means no data is sent anywhere and you get your own little service, wholly subservient to you like a good little program, that's nice!

If the onboard LLM means better data filtering, possibly even exploration of the local system, to send information to Google while lowering their datacentre bills for running LLM services, that seems a little underhanded to bake into things without notification.

Pick your assumption, you get your outcome. What are your assumptions?

  • Why assume? It should be observable. You can check the code and the data traffic to see how it is actually used.

    • You can't observe the future, so learn from the past, or use common sense. You don't react to a stranger or a mysterious camera in your household by saying, wow, ok, let's see if anything bad happens.
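
For what it's worth, the "check the data traffic" suggestion is concretely doable: export a HAR capture from the browser's DevTools (Network panel, "Export HAR") and list which hosts were contacted. A minimal sketch below; the file path and the sample capture are made up for illustration, but the HAR structure (`log.entries[].request.url`) is the standard format.

```python
import json
from urllib.parse import urlparse

def hosts_contacted(har_path):
    """Return the sorted set of hostnames found in a DevTools HAR capture."""
    with open(har_path) as f:
        har = json.load(f)
    return sorted({urlparse(entry["request"]["url"]).hostname
                   for entry in har["log"]["entries"]})

# Tiny synthetic capture standing in for a real DevTools export:
sample = {"log": {"entries": [
    {"request": {"url": "https://example.com/page"}},
    {"request": {"url": "https://telemetry.example.net/ping"}},
]}}
with open("/tmp/sample.har", "w") as f:
    json.dump(sample, f)

print(hosts_contacted("/tmp/sample.har"))
# → ['example.com', 'telemetry.example.net']
```

This only shows what left the browser during the capture, of course, which is the parent comment's point: it tells you about the past, not about what a future update might send.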