Comment by Macha
19 hours ago
I've always thought that the idea that "revealed preferences" are preferences discounts the fact that people often make decisions they would rather not make. It's like the whole idea that if you're on a diet, it's easier not to have junk food in the house in the first place than to have junk food around and not eat more than your target amount. Are you saying these people want to put on weight? Or have they just been put in a situation that defeats their impulse control?
I feel a lot of the "revealed preference" stuff in advertising is similar: advertisers find that if they get past the barriers users have put in place, it becomes easier to sell them stuff that, at a higher level, the users do not want.
Perfectly put. Revealed preference simply assumes all impulses are correct, which is not the case, and then exploits that.
Drugs make you feel great: in moderation, perfectly acceptable; constantly, not so much.
Absolutely. Nicotine addiction can meet the criteria for a revealed preference; it's certainly an observed choice.
One example I like to use is schadenfreude. The emotion makes us feel good and bad at the same time: it's pleasurable, but in an icky way. So should social media algorithms serve schadenfreude? Should algorithms maximize for pleasure (show it) or for some kind of "higher self" (don't show it)? And if they maximize for the "higher self", which designer gets to choose what that means?