Comment by TeMPOraL

2 months ago

This is becoming a common pattern everywhere now.

Case in point (and sorry for bringing up this topic): LLM providers seem to be doubling down on automatic model selection, marketing it as a feature that improves experience and response quality for users, even though it's a blatant attempt to cut serving costs by steering users toward a cheaper, weaker model - or taking the choice away from them entirely. That's obviously not what users want; in this space, even more than in video streaming, what the user wants in 90%+ of end-user cases is the best SOTA model available.

At least with YouTube, I recall them being up front about this in the past, early in the COVID-19 pandemic - IIRC the app itself explained in the UI that the default quality was being lowered to conserve bandwidth, which had suddenly become much scarcer.

How’d I miss this!

Bloomberg, March 24, 2020:

“YouTube to Limit Video Quality Around the World for a Month”

Guess it’s like a “temporary” tax… how’s a tax ever going to go away once collection starts? :)