Comment by PaulHoule

5 days ago

I had a business partner I agreed with on a lot of things, but not about Intel. My assumption was that any small software package from Intel, such as a graph processing toolkit, was trash. He thought they could do no wrong.

Intel really is good at certain kinds of software, like compilers or MKL, but my belief is that organizations like that have a belief in their "number oneness" that gets in the way of doing anything outside what they're good at. Maybe it is the people, processes, organization, values, etc. that get in the way. Or maybe it's not having the flexibility to see that what is good at task A is not good at task B.

I always saw Intel as a HW company making terribly bad SW. Anywhere I saw Intel SW I would run away. Lately I used a big open source library from them, which is standard in the embedded space. It works great, but if you look at the code you will be puking for a week.

  • In my experience Intel's WiFi and Bluetooth drivers on Linux are, by far, the best. They're reliably available on the latest kernel and they actually work. After having used other brands on Linux, I have no intention of getting non-Intel WiFi or Bluetooth any time soon. The one time that I found a bug, emailing them about it got me in direct contact with the developers of the driver.

    I had a different, non-Intel WiFi card before where the driver literally permanently fried all occupied PCIe slots -- they never worked again, and the problem happened right after installing the driver. I don't know how a driver could cause that, but it looks like this one did.

    • Yes, their open source drivers had a painful birth, but they are good once they've been sanded and sharpened by the community.

      However, they somehow managed to bork the e1000e driver in a way that certain older cards sometimes fail to initialize and require a reboot. I was bitten by the bug, and the problem was later fixed in Debian by reverting the problematic patch.

      I don't know the current state of the driver since I passed the system on. Besides a couple of bad patches in their VGA drivers, their cards are reliable and work well.

      From my experience, their open source driver quality does not depend on the process, but on specific people and their knowledge and love for what they do.

      I don't like the aggressive Intel which undercuts everyone with shady tactics, but I don't want them to wither and die either. Still, it seems like their process, frequency, and performance "tricks" are biting them now.

    • Interesting. Does BlueZ fall under that umbrella?

      I have found BlueZ by far the hardest stack to use for Bluetooth Low Energy peripherals. I have used iOS's stack, suffered the evolution of the Android stack, used the ACI (ST's layer), and finally done just straight Python to the HCI on a Pi (a minimal sketch of that last approach is below). BlueZ is hands down my least favorite.
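
      For reference, the "straight Python to the HCI" approach looks roughly like the sketch below. This is illustrative only, assuming Linux with Python 3's AF_BLUETOOTH support, root privileges, and a single adapter (hci0); the bytes follow the standard HCI command packet format, and reading back the Command Complete event (which needs an HCI filter via setsockopt) is omitted:

      ```python
      import socket
      import struct

      # Raw HCI socket straight to the kernel -- roughly what hcitool uses.
      sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_RAW,
                           socket.BTPROTO_HCI)
      sock.bind((0,))  # 0 == hci0

      # HCI command packet: type 0x01, opcode (little-endian), param length,
      # then parameters. LE Set Scan Enable is OGF 0x08, OCF 0x000C, i.e.
      # opcode 0x200C, with params enable=0x01, filter_duplicates=0x00.
      opcode = (0x08 << 10) | 0x000C
      sock.send(struct.pack("<BHB", 0x01, opcode, 2) + bytes([0x01, 0x00]))
      ```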

    • That's only because their hardware is extremely simple.

      So the driver has little to screw up. But they still manage to! For example, the PCI cards are all broken, when it's literally the same hardware as the USB ones.

  • The team working on their RealSense depth cameras was doing great work on the SDK, in my opinion.

    Frequent releases, GitHub repo with good enough user interaction, examples, bug fixing and feedback.

    > such as a graph processing toolkit

    This is oddly specific. Can you share the exact Intel software toolkit?

    > "number oneness"

    Why does this not affect NVidia, Amazon, Apple, or TSMC?

  • The affliction he’s imputing is born of absolute dominance over decades. Apple has never had the same level of dominance, and NVidia has only had it for two or three years.

    It could possibly come to haunt NVidia or TSMC in decades to come.

  • A friend who developed a game engine from scratch and is familiar with the inner workings and behavior of the NVIDIA driver calls it an absolute circus of a driver.

    Also, their latest consumer card launches have been less than stellar, and the tricks they use to pump up performance numbers are borderline fraud.

    As Gamers Nexus puts it "Fake prices for fake frames".

    • My response is somewhat tangential: When I look at GPUs strictly from the perspective of gaming performance, the last few generations have been so underwhelming. I am not a gamer, but games basically look life-like at this point. What kind of improvements are gamers expecting going forward? Seriously, a mid-level GPU has life-like raytracing at 4K/60Hz. What else do you need for gaming? (Please don't read this as looking down upon gaming; I am only questioning what else gamers need from their GPUs.)

      To me, the situation is similar with monitors. After we got the pixel density of 4K at 27 inches with 60Hz refresh rate (enough pixels, enough inches, enough refresh rate), how can it get any better for normies? Ok, maybe we can add HDR, but monitors are mostly finished, similar to mobile phones. Ah, one last one: I guess we can upgrade to OLED when the prices are not so scandalous. Still, for the corporate normies, who account for the lion's share of people sitting in front of 1990s-style desktop PCs with a monitor, they are fine with 4K at 27 inches with 60Hz refresh rate forever.


See, the funny thing is: even with all of this stuff about Intel that I hear about (and agree with as reported), I also just committed a cardinal sin recently.

I'm old, i.e. "never buy ATI" is something I've stuck to since the very early Nvidia days, when I switched from Matrox and Voodoo to Nvidia while commiserating over and witnessing friends' and colleagues' ATI woes for years.

My high-end gaming days are long gone; there was even a stretch of laptop years where 3D graphics was of no concern whatsoever. I happened to have Intel chips and integrated graphics. I could even start up some gaming I had missed out on over the years, or replay old favourites, just fine, as even a business laptop's Intel integrated graphics chip was up to it.

And then I bought an AMD-based laptop with integrated Radeon graphics: because of all that negative stuff you hear about Intel, and because AMD itself is fine, sometimes even better, I thought it was fair to give it a try.

Oh my, was that a mistake. AMD Radeon graphics is still the old ATI in full-blown problem glory. I guess it's going to be another 25 years until I might make that mistake again.

  • It's a bummer you've had poor experiences with ATI and later AMD, especially on a new system. I have an AMD laptop with Ryzen 7 7840U which includes a Radeon 780M for integrated graphics and it's been rock solid. I tested many old and new titles on it, albeit at medium-ish settings.

    What kind of problems did you see on your laptop?

    • Built a PC with a top-of-the-line AMD CPU; it's great. AMD APUs are great in dedicated gaming devices like the Xbox One, PS4 and PS5, and the Steam Deck.

      On the other hand, I still think of the Intel integrated GPU as "that thing that screws up your web browser chrome if you have a laptop with dedicated graphics".

    • Not tharkun__:

      AMD basically stopped supporting (including updating drivers for) GPUs before RDNA (in particular GCN), while such GPUs were still part of AMD's Zen 3 APU offerings.

      Well, back when, literally 25 years ago, when it was all ATI, there were constant driver issues with ATI. I think it's a pretty well-known thing. At least it was back then.

      I did think that given ATI was bought out by AMD, and AMD itself is fine, it should be OK. AMD always was. I've had systems with AMD CPUs and Nvidia GPUs back when it was an actual desktop tower gaming system I was building/upgrading myself. Heck, my basement server is still an AMD CPU system with zero issues whatsoever. Of course it's got zero graphics duties.

      On the laptop side, for a time I'd buy something with discrete Nvidia cards when I was still gaming more actively. But then life happened, so graphics was no longer important and I do keep my systems for a long time / buy non-latest gen. So by chance I've been with Intel for a long time and gaming came up again, casually. The Intel HD graphics were of course totally inadequate for any "real" current gaming. But I found that replaying some old favs and even "newer" games I had missed out on (new as in, playing a 2013 game for the very first time in 2023 type thing) was totally fine on an Intel iGPU.

      So when I was getting to newer titles and the Intel HD graphics no longer cut it, but I'm still not a "gamer" again, I looked at a more recent system and thought I'd be totally fine trying an AMD system. Exactly like another poster said: "post 2015 should be fine, right?! And then there's all this recent bad news about Intel, this is the time to switch!"

      Still iGPU. I'm not going to shell out thousands of dollars here.

      And then I get the system and I get into Windows and ... everything just looks way too bright, washed out, hard to look at. I doctored around, installed the latest AMD Adrenalin driver, played around with brightness, contrast, HDR, color balance, tried to disable the Vari-Brightness I read was supposed to be the culprit, etc. It actually gets worse once you get into a game. Like, you're in Windows and it's bearable. Then you start a game, you might Alt-Tab back to do something, and everything is just awfully, weirdly bright, and it doesn't go away when you shut down the game either.

      I stuck with it and kept doctoring for over 6 months now.

      I've had enough. I bought a new laptop, two generations behind, with an Intel Iris Xe, for the same amount of money as the ATI system. I open Windows and ... everything is entirely, totally, 150% fine; no need to adjust anything. It's comfortable, colors are fine, brightness and contrast are fine. And the performance is entirely adequate, the same as with the AMD system. Again, still iGPU, and that's fine and expected. It's the quality I'm concerned with, not the performance I'm paying for. I expect to be able to get proper quality software and hardware even if I pay for less performance than gamer-kid me was willing to back in the day.


  • Did you time travel from 2015 or something? Haven't heard of anyone having AMD issues in a very long time...

  • Meanwhile PC gamers have no trouble using their AMD GPUs to play Windows games on Linux.

    • That's actually something I have not tried again yet.

      Back in the day, with an AMD CPU and Nvidia GPU, I was gaming on Linux a lot. ATI was basically unusable on Linux, while Nvidia (not with the nouveau driver, of course), if you looked past the whole kernel driver controversy with the GPL hardliners, was excellent in quality and performance. It just worked and it performed.

      I was playing World of Warcraft back in the mid 2000s via Wine on Linux and the experience was actually better than on Windows. And other titles too, like Counter-Strike 1.5 and 1.6, and Q3 of course.

      I have not tried that in a long time. I did hear exactly what you're saying here. Then again I heard the same about AMD buying ATI and things being OK now. My other reply(ies) elaborate on what exactly the experience has been if you're interested.


  • I wish I had an AMD card. Instead our work laptops are X1 Extremes with discrete Nvidia cards, and they are absolutely infuriating. The external outputs are all routed through the Nvidia card, so one frequently ends up with the fan blowing at full blast when plugged into a monitor. Moreover, on unplugging, the laptop often fails to shut down the discrete graphics card, so suddenly the battery is empty (because the discrete card uses twice the power). The Intel card, on the other hand, seems to prevent S3 sleep when on battery, i.e. the laptop starts sleeping and immediately wakes up again (I chased it down to the Intel driver but couldn't get further; a sketch of this kind of chase is at the end of this comment).

    And I'm not even talking about the hassle of the Nvidia drivers on Linux (which admittedly has become quite a bit better).

    All that just for some negligible graphics power that I'm never using on the laptop.
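
    Tangentially, the "starts sleeping and immediately wakes up" class of problem can often be narrowed down from userspace. A rough sketch, assuming Linux with debugfs mounted and root privileges; the two files below are standard kernel interfaces, but the quick-and-dirty parsing of their layout is my assumption:

    ```python
    from pathlib import Path

    # Devices allowed to wake the machine; a stray "*enabled" entry here is
    # a common culprit when suspend aborts immediately.
    print(Path("/proc/acpi/wakeup").read_text())

    # Per-source wakeup statistics. Print sources that have ever been active
    # to narrow the list; a source whose counts climb right after a failed
    # suspend (e.g. a GPU or its driver) is the likely offender.
    lines = Path("/sys/kernel/debug/wakeup_sources").read_text().splitlines()
    header = lines[0].split()
    for line in lines[1:]:
        fields = line.split()
        if fields and fields[1] != "0":  # active_count column
            print(dict(zip(header, fields)))
    ```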