Comment by kube-system

3 months ago

Most of the people looking at browser statistics for the purpose of managing a website are using simple tools that just collect data from user agent strings. Determining the browser from that isn't 100% straightforward, but it's enough to give website operators a rough idea of which browsers to target. This data mattered more in the days when everything wasn't Chrome/Android/iOS and it actually made a difference which version of IE your users were running.
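For a sense of how naive that usually is, here's a minimal sketch of the kind of UA-string bucketing those tools do. This is my own illustration, not any particular vendor's code:

```typescript
// Minimal, illustrative UA-string bucketing — roughly what simple
// analytics tools do. Order matters: Edge and Opera UAs also contain
// "Chrome", and Chrome's UA contains "Safari", so the more specific
// tokens are checked first.
function browserFamily(ua: string): string {
  if (/Edg\//.test(ua)) return "Edge";
  if (/OPR\/|Opera/.test(ua)) return "Opera";
  if (/Firefox\//.test(ua)) return "Firefox";
  if (/Chrome\//.test(ua)) return "Chrome"; // also matches most Chromium forks
  if (/Safari\//.test(ua)) return "Safari";
  if (/MSIE |Trident\//.test(ua)) return "Internet Explorer";
  return "Other";
}

// Example: a desktop Chrome UA claims to be Mozilla, AppleWebKit and Safari too.
console.log(browserFamily(
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36"
)); // "Chrome"
```

And that's exactly why it's not 100% straightforward: Brave ships a stock Chrome user agent, so a check like this counts it as Chrome, and plenty of other Chromium forks look the same.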

If you're doing fingerprinting for tracking purposes, you're gonna be tracking a lot more in-depth data.
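To make the contrast concrete, here's a rough sketch of the kind of signals a fingerprinting script combines. This is my own illustration of the general technique, not any specific tracker's code; real ones add canvas/WebGL rendering, audio, font enumeration and more:

```typescript
// Illustrative only: a small subset of the signals a fingerprinting
// script might combine into a single identifier.
async function roughFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,
    navigator.language,
    navigator.hardwareConcurrency,
    `${screen.width}x${screen.height}@${window.devicePixelRatio}`,
    new Date().getTimezoneOffset(),
  ].join("|");

  // Hash the concatenated signals so the identifier is compact and opaque.
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(signals)
  );
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

roughFingerprint().then((id) => console.log("fingerprint:", id));
```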

But in the end, there are pretty much three types of Internet user today: 1. the person who uses the default browser installed on their device, 2. the user who downloads Chrome the moment they get a new computer, and 3. nerds who do something else.

I don't disagree, but it makes for bad web development practices. Google Analytics is still the de facto king everywhere I've worked, but it undercounts Safari and Firefox more than other browsers because of tracking protections and those users being more likely to run ad blockers. Then there are all the edge cases like Brave.

I don't remember the exact discrepancy the study found, but it was significant.
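Whatever the study's exact numbers were, the mechanism is easy to see. Here's a purely hypothetical illustration (every percentage below is made up) of how blocked analytics inflate Chrome's reported share:

```typescript
// Purely hypothetical numbers, just to show the mechanism: the true split
// vs. what the analytics backend actually sees once blockers are applied.
type Browser = "chrome" | "safari" | "firefox" | "other";

const realShare: Record<Browser, number> =
  { chrome: 0.65, safari: 0.20, firefox: 0.10, other: 0.05 };
// Assumed fraction of each browser's users who block the analytics script.
const blockRate: Record<Browser, number> =
  { chrome: 0.15, safari: 0.40, firefox: 0.50, other: 0.20 };

// Visits that survive the blockers and actually get counted.
const measured = {} as Record<Browser, number>;
for (const b of Object.keys(realShare) as Browser[]) {
  measured[b] = realShare[b] * (1 - blockRate[b]);
}
const total = Object.values(measured).reduce((a, b) => a + b, 0);

// Reported share = counted visits / all counted visits.
for (const b of Object.keys(measured) as Browser[]) {
  console.log(b, ((measured[b] / total) * 100).toFixed(1) + "%");
}
// chrome 72.5%  safari 15.7%  firefox 6.6%  other 5.2%
// vs. the "real" 65 / 20 / 10 / 5 — Chrome looks ~7 points bigger than it is.
```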

So… we keep optimising for Chrome as if it were practically our whole audience. That makes things shittier for everybody else, and we tell ourselves it's fine because they're such a small part of the group. This reminds me of a former client burning almost 9 million euros every year because they excluded IE6–8 from their reporting, even though those browsers accounted for around 15% of their traffic.
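The arithmetic on that last one is depressingly simple. A back-of-the-envelope sketch, with a revenue figure I'm inventing purely to show how 15% of excluded traffic turns into roughly 9 million a year:

```typescript
// Back-of-the-envelope sketch. The 60M figure is hypothetical; the 15%
// share and ~9M loss are the numbers from the comment above. Assumes the
// excluded IE6–8 visitors would have converted like everyone else.
const annualRevenueIfEveryoneCouldBuy = 60_000_000; // EUR, hypothetical
const excludedTrafficShare = 0.15;                  // IE6–8 share of traffic

// If that 15% hits a broken experience, their revenue simply never happens.
const lostPerYear = annualRevenueIfEveryoneCouldBuy * excludedTrafficShare;
console.log(`≈ ${lostPerYear / 1e6} million EUR lost per year`);
// ≈ 9 million EUR lost per year — and because IE6–8 was filtered out of
// the reporting, the dashboards never showed anyone there to lose.
```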