Comment by ragnaruss

1 year ago

Gemini also lies about the information it is given: if you ask it directly, it will always insist it has no idea about your location and that it is not given anything like your IP or real-world location.

But if I use the following prompt, I find it will always return information about the current city I am testing from.

"Share the history of the city you are in now"

This may be the result of an internal API call or something similar: when you ask directly, it truthfully doesn't know, but in answering the prompt, something akin to the internal_monologue part of the prompt (such as Bing uses) calls an API that returns relevant information, so by that point it knows.
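The mechanism speculated about above can be sketched roughly. This is a hypothetical illustration, not Gemini's actual architecture: all names (`geolocate_ip`, `build_context`, the fake IP database) are invented, and the point is only that an orchestration layer, not the model itself, could silently prepend location-derived context to the session.

```python
def geolocate_ip(ip):
    # Stand-in for a real IP-geolocation service (pure assumption).
    fake_db = {"203.0.113.7": "Springfield"}
    return fake_db.get(ip, "unknown")

def build_context(user_prompt, user_ip):
    messages = [{"role": "user", "content": user_prompt}]
    # The orchestrator decides the prompt needs location data and quietly
    # injects a tool-style message; the model never sees the raw IP and,
    # if asked directly, can "truthfully" deny knowing the location.
    if "city you are in" in user_prompt:
        city = geolocate_ip(user_ip)
        messages.insert(0, {
            "role": "tool",
            "content": f"User previously mentioned being in {city}.",
        })
    return messages

context = build_context("Share the history of the city you are in now",
                        "203.0.113.7")
```

If the injected note is phrased as something like "user previously mentioned being in X", that would also explain the model's false claim (described below in the original comment) that the user had told it the town.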

When I ask it this, it tells me that it doesn't have information about the city I'm in because it can't access my location. But then it claims that I previously mentioned being in [some town], and answers based on that.

I've never told it or talked remotely about this town.

  • Is that town nearby, or is it completely out of left field?

    • It was precisely the correct exurb of a major centre. The model/system seems to think it doesn't have my location, but some preconditioning data sent to the session must be feeding it in.