Comment by 1vuio0pswjnm7

8 months ago

There should be a great diversity in user agents because there is great diversity in personal tastes.

One person's user agent might be another person's "software I would never use".

As a text-only web user I am continually amazed, thirty years in, that web developers and now their CDN service providers are _still_ making incorrect assumptions about what user agent I am using. They are wrong every single time. There is almost zero focus on rate limits but hyperfocus on the user-agent string or other headers. For most sites I send no user-agent header and this works fine. But when sites demand certain headers, it tells me the purpose is not "protecting" servers from being overloaded; it is "protecting" servers from web users who will not involuntarily provide commercially useful data/information, so that access to them as ad targets can be sold for profit.

Choice of user agent should make no difference. The JSON I'm getting is the same regardless of what device or software I am using. I decide what I want to do with the JSON after I retrieve it.
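The claim above is easy to check: a bare HTTP request carries no User-Agent unless the client adds one. A minimal sketch in Python, using a throwaway local server so we can see exactly which headers arrive (the server, port, and `/data.json` path are all stand-ins, not anything from the comment):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

received = {}  # headers the test server actually sees

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        received.update(self.headers)  # record incoming headers
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"ok": true}')

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client, unlike urllib.request, does not add a User-Agent on its own,
# so this request goes out without one.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/data.json")
body = conn.getresponse().read()
server.shutdown()

print("User-Agent" in received)
print(body)
```

The JSON body comes back the same either way; whether a server chooses to refuse the header-less request is policy, not necessity.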

Imagining how things could be different, there could be "commercial" user agents that people use for accessing their bank accounts online and for other commercial transactions. There could also be "non-commercial" user agents that people use to read HN. Unfortunately, the way things are now, people are using commercial browsers for non-commercial web use and exposing themselves 24/7 to unnecessary tracking and advertising.

Personally, I only use a commercial user agent infrequently. I'm not doing many commercial transactions over the web. Most of the time, I am using non-commercial user agents. I see no ads and can focus on the text.

There are easily fewer than 1,000 people using the internet the way that you do. The internet is not immune from cost-benefit.

  • I think I see an underlying point though. What other Internet protocol or service requires the user client to supply endless additional arbitrary metadata to even gain access to a resource, let alone receive information? Not even email is that cumbersome on the client side. Although it is the way it is, for better or worse.

  • Right, pack it up. You all heard the guy with the random username. Corporations have the power to make things convenient so I guess we should just give up and allow ubiquitous corporate control.

> it is "protecting" servers from web users who will not involuntarily provide commercially useful data/information

I don't think it comes down to that; I think it's more that your browser likely looks more like a bot than like a human.

Also, rate limiting carries significant overhead and complexity at scale, whereas user-agent filtering is relatively cheap and easy to distribute. That said, this is a problem that has been solved many, many times over, and the additional overhead is not that bad. All said, I've met too many developers who don't conceptually understand public/private-key encryption, so I'd assume they'd mess up rate limiting too.
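The core algorithm really is simple; the hard part is sharing state across many edge nodes. A minimal token-bucket sketch in Python (the `rate`/`capacity` numbers and the per-IP keying are illustrative assumptions, not anything from the thread):

```python
import time

class TokenBucket:
    """Minimal token bucket: refills `rate` tokens/second, bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Hypothetical per-client buckets keyed by IP.
buckets: dict[str, TokenBucket] = {}

def check(ip: str) -> bool:
    bucket = buckets.setdefault(ip, TokenBucket(rate=5, capacity=10))
    return bucket.allow()
```

A single-process bucket like this is trivial; the operational cost comes from keeping those counters consistent across a distributed fleet, which is exactly why stateless header checks are tempting.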