Comment by brookst
2 days ago
Sure, but what does this change? Plenty of people are better geoguessers than this LLM. Anyone trying to find someone who is both trying not to be found and still posting pictures publicly can just copy those pictures to Reddit and ask “where is this?”
I’m not a fan of this variation on “think of the children”. It has always been possible to deduce location from images. The fact that LLMs can also do it changes exactly nothing about the privacy considerations of sharing photos.
It’s fine to fear AI but this is a really weak angle to come at it from.
Same as with other forms of automation: it makes this capability much easier for bad actors to obtain.
I've got the impression that geoguessing has at least a loose code of ethics associated with it. I imagine you'd have to work quite hard to find someone with those skills willing to help you stalk your ex - you'd have to mislead them about your goal, at least.
Or you can sign up for ChatGPT and have as many goes as you like with as many photos as you can find.
I have a friend who's had trouble with stalkers. I'm making sure they're aware that this kind of thing has just got a lot easier.