Comment by antisol

8 months ago

> An attitude like what?

You're implicitly telling people (as others have done more explicitly) that rather than using the hardware they want to use - for perfectly legitimate reasons, which you've chosen to ignore - they should go buy something else, just so they can adopt your preferred software stack.

I just want you to realise that with every word you type you're digging a deeper hole. When you dig in, shift goalposts, ignore perfectly legitimate issues, and spew insane troll logic, you make people more hostile and less interested in adopting wayland. I didn't actually give much of a shit before - I just thought wayland was a pretty funny case of vapourware that may or may not come to fruition one day - but now that I see just how evangelical and nonsensically-ideologically-driven some of you rabid fanboys are, it'll take a fairly significant shift to make me want to try it again. Your nonsense trolling here has done wayland a disservice. Congratulations.

I refer you back to this passage in my original anecdote:

  "I want to stress that I was *hoping it would work* - I was not out to find a reason not to use wayland"

> don't like nvidia because they've been actively hostile to OSS for decades

Hey, guess what? This might come as a shock, but nvidia release drivers for X, and have for decades, and they work just fine. How do their drivers work on wayland? Oh, that's right - they basically don't. I posted links about that what feels like a hundred thousand messages ago. Your reaction was to deflect by saying they're not very good anyway, which completely fails to even attempt to respond to the issue I pointed out.

> Also what I said about nvidia is 100% factually true.

You might note, if you read back over the history of this thread, that I didn't actually ask for your opinion of nvidia or their hardware at any point. Nor did I ask for an "objective" evaluation of the performance of nvidia cards relative to others. The reason I didn't ask is that I don't actually give a shit what your opinion of nvidia and their hardware is. I hope this clears things up for you.

> they are objectively the worst option

Not for use cases that explicitly require nvidia cards and don't support anything else.

Also, somewhat related: it seems that, like a ton of things I've said, you forgot to address my point that cuda is basically the only game in town when it comes to ML. I guess that must have been an oversight, and not at all intentional, ideologically driven deflection.