Comment by pragma_x

12 days ago

It's also interesting to look at other architectures of the time to get an idea of how fiendish a problem this is. Back then, Commodore, Nintendo, and others had dedicated silicon for video rendering. This freed the CPU from having to generate a video signal directly; it spends only a fraction of those cycles talking to the video subsystem instead. The major drawback of a dedicated video chip is of course cost (custom fabrication, part count), which the Macintosh team was clearly trying to keep as low as possible.

Both of the key 8-bit contenders of yore, the Atari 8-bit series and the Commodore 64, had custom graphics chips (ANTIC and VIC-II) that "stole" cycles from the 6502 (or the 6510, in the C64's case) whenever they needed to access memory.
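
For a rough sense of scale, here's a back-of-envelope sketch in C of what a frame's worth of VIC-II "badline" stealing costs on a PAL C64. The timing figures are quoted from memory and approximate, so treat them as illustrative:

```c
/* Back-of-envelope sketch of VIC-II cycle stealing on a PAL C64.
 * Approximate figures: ~63 CPU cycles per raster line, 312 lines per
 * frame, and on every "badline" (one of every 8 lines while the
 * screen is on) the VIC-II halts the 6510 for roughly 40 cycles to
 * fetch character pointers. Sprites steal more cycles on top of this.
 */
#include <stdio.h>

int main(void) {
    const long cycles_per_line = 63;   /* PAL C64, approximate */
    const long lines_per_frame = 312;
    const long visible_lines   = 200;  /* the display window */
    const long badline_steal   = 40;   /* cycles lost per badline, approx. */

    long total    = cycles_per_line * lines_per_frame;
    long badlines = visible_lines / 8;  /* one per character row */
    long stolen   = badlines * badline_steal;

    printf("cycles per frame:     %ld\n", total);
    printf("stolen by badlines:   %ld (%.1f%%)\n",
           stolen, 100.0 * stolen / total);
    printf("effective CPU cycles: %ld\n", total - stolen);
    return 0;
}
```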

I remember writing CPU-intensive code on the Atari and blanking the video to speed it up.
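
If memory serves, the trick looked something like the sketch below: turn off ANTIC's display DMA so it stops stealing cycles, run the hot loop at full CPU speed, then restore the screen. The register details are from memory (SDMCTL at $022F is the OS shadow copied into ANTIC's DMACTL at $D400 each vertical blank), and `crunch_numbers` is a hypothetical stand-in for the real work:

```c
/* Sketch of the Atari 8-bit "screen off" speedup, roughly as it
 * might look under cc65. Locations/values quoted from memory:
 * writing 0 to SDMCTL ($022F) disables display DMA (screen goes
 * black); the saved value restores the normal playfield.
 */
#include <stdint.h>

#define SDMCTL (*(volatile uint8_t *)0x022F) /* OS shadow of DMACTL */

/* Hypothetical stand-in for the actual CPU-bound work. */
void crunch_numbers(void) { /* ... */ }

void run_with_screen_off(void) {
    uint8_t saved = SDMCTL;
    SDMCTL = 0x00;      /* screen blanks; ANTIC stops stealing cycles */
    crunch_numbers();   /* runs noticeably faster, mode-dependent */
    SDMCTL = saved;     /* restore display DMA */
}
```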

Plus, those displays weren't raw bitmaps but tile-based, which helped keep memory and bandwidth costs down.

  • Not sure we're thinking the same way, but the C64 and Atari had bitmap modes, not just tile or character modes.
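
For a rough sense of the savings being traded off in this exchange (both machines did offer true bitmap modes, as noted above), here's the usual arithmetic, with the standard C64-style figures quoted from memory:

```c
/* Tile-vs-bitmap memory cost: a 320x200 1-bpp bitmap needs 8000
 * bytes, while a 40x25 character screen needs only 1000 bytes of
 * screen codes plus a shared character set, and each scanline fetch
 * is correspondingly cheaper.
 */
#include <stdio.h>

int main(void) {
    int bitmap_bytes  = 320 * 200 / 8;  /* 1 bit per pixel */
    int tile_bytes    = 40 * 25;        /* one byte per character cell */
    int charset_bytes = 256 * 8;        /* 256 glyphs, 8 bytes each */

    printf("full bitmap: %d bytes\n", bitmap_bytes);
    printf("tile screen: %d bytes (+ %d byte charset)\n",
           tile_bytes, charset_bytes);
    return 0;
}
```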

And yet, despite the lower parts count, the Macintosh was more expensive than competing products from Commodore and Atari that did have dedicated silicon for video rendering. I guess Apple must have had huge gross margins on hardware sales, given how little was in the box.