Comment by bigfatkitten
5 days ago
It was a silly acquisition in the first place, and their justification clearly came from a coke-addled fever dream.
Intel soon discovered the obvious, which is that customers with applications well-suited to FPGAs already use FPGAs.
There was some hope at the time that FPGAs could be used in a lot more applications in the data center. It is likely still feasible. Remember Hennessy published:
https://www.doc.ic.ac.uk/~wl/teachlocal/arch/papers/cacm19go...
And maybe this is/was a pipe dream - maybe there aren't enough people with the skills to have a "golden age of architecture". But MSFT was deploying FPGAs in the data center and there were certainly hopes and dreams this would become a big thing.
That was certainly the dream, but unfortunately for them it didn't turn out to be a new market.
I don't know enough about hardware to know why - why didn't this story play out as hoped?
It made their stock pop for a while, which was all that mattered to Brian Krzanich, who took the bonus and left the mess in the hands of Bob Swan, who did the same thing and left the mess ... (recursion here).
> Intel soon discovered the obvious, which is that customers with applications well-suited to FPGAs already use FPGAs.
Yes, but an FPGA tightly integrated with an actually powerful x86 CPU would have made an interesting alternative to the usual FPGA + some low-end ARM combo that's common these days.
Sure, if they had wanted to, Intel could have done what Nvidia did with CUDA: put the tech into everything, even their lowest-end consumer devices, and sink hundreds of millions into tooling and developer education given away free of charge.
And maybe it would have led somewhere. Perhaps. But they didn't.
That was the thought at the time, that they'd do this. It's amazing that they don't seem to have actually tried? Any sense as to why, or what went wrong?
Yes, if they had actually made the thing available, maybe people would have used it for something. There were several proofs of concept at the time, with some serious gains, even for the uses people ended up turning to CUDA for.
But they didn't actually sell it. At least not in any form anybody could buy. So, yeah, we get the OP claiming it was an obvious technological dead-end.
And if they had included it on lower-end chips (the ones they sold just a few years after they bought Altera), we could have had basically what the Raspberry Pi RP2040 is nowadays, just a decade earlier and controlled by them... On second thought, maybe this was for the best.
Applications that benefit from the Zynq-style combination (e.g. radio systems) generally take that approach because they have SWaP concerns that preclude the use of big x86 CPUs in the first place.
What's a SWaP concern?
It was a forced acquisition. IIRC, Intel made promises to Altera to get them to use their foundry, failed to keep those promises, and could either get sued and embarrassed or just buy Altera outright for about what they were worth before the deal.
> Intel soon discovered the obvious, which is that customers with applications well-suited to FPGAs already use FPGAs.
So selling FPGAs was a bad move? Or was the purchase price just wildly out of line with the--checking...$9.8B annual market that's expected to rise to $23.3B by 2030?
Intel can't even act as a functional foundry for internal customers.
Do you think AMD's decision to buy Xilinx was any better or not?
Perhaps we can say it was less of a distraction for AMD, given that AMD is not having the basic execution issues that Intel is currently suffering from.
And less disastrous for Xilinx, given they could basically just keep going as they were before, instead of being significantly diverted onto a sinking ship of a process.
If AMD did the same thing years later, was it really that foolish?
Yes, because AMD is fabless, and Xilinx didn't suddenly have to figure out how to work around Intel's production problems.