Comment by inkyoto

1 year ago

> they don't care […]

Precisely. Embedded cares about only one thing: «get the product off the ground and ship it fast, bugs included». And since embedded software is not user-facing, vendors can get away with a «power cycle the device if it stops responding» recommendation in the user guide.

Embedded also treats the CPU as a disposable commodity rather than a long-term asset, and throwing the entire code base away when an alternative CPU/ISA (cheaper, more power efficient, etc. – you name it) comes along is a well-entrenched habit. Where is all the code once written for the 68HC11, PIC, AVR and the like? Nowhere. It has all but been thrown away for various reasons (architecture switches, architecture obsolescence and so on). The same has not happened to Intel, whose code is still around and running.

For more substantial embedded development, the responsibility for adopting a new ISA falls on the vendor of the embedded OS/runtime (e.g. VxWorks) or the embedded CPU vendor, who makes a reasonable effort to support the hardware features important to customers but does not carry out extensive testing of all features. Again, the focus is on letting the vendor's customers ship their product fast. The quality of embedded development toolchains is also questionable more often than one would like, and complaints about poor support for the underlying hardware are common. Such complaints are typically ignored.

> but for most uses it doesn't matter […]

Which is why embedded adoption is not a useful metric for predicting the success of a CPU architecture in user-facing scenarios (namely, personal and server computing).