Why the original Macintosh had a screen resolution of 512×324

12 days ago (512pixels.net)

The article didn't nail down an exact reason. Here is my guess. The quote from Andy Hertzfeld suggests the limiting factor was memory bandwidth, not memory capacity:

> The most important decision was admitting that the software would never fit into 64K of memory and going with a full 16-bit memory bus, requiring 16 RAM chips instead of 8. The extra memory bandwidth allowed him to double the display resolution, going to dimensions of 512 by 342 instead of 384 by 256

If you look at the specs for the machine, you see that during an active scan line, the video is using exactly half of the available memory bandwidth, with the CPU able to use the other half (during horizontal and vertical blanking periods the CPU can use the entire memory bandwidth)[1]. That dictated the scanline duration.

If the computer had any more scan lines, something would have had to give, as every nanosecond was already accounted for[2]. The refresh rate would have had to be lower, the blanking periods would have had to be shorter, the memory bandwidth would have had to be higher, or the bandwidth would have had to be divided unevenly between the CPU and video, which was probably harder to implement. I don't know which of those things they would have been able to adjust and which were hard requirements of the hardware they could find, but I'm guessing that they couldn't do 384 scan lines at 60 Hz given the memory bandwidth of the RAM chips and the blanking times of the CRT they selected (a rough sanity check follows the references below).

[1]https://archive.org/details/Guide_to_the_Macintosh_Family_Ha...

[2]https://archive.org/details/Guide_to_the_Macintosh_Family_Ha...
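
A rough sanity check of that guess. The figures below (15.6672 MHz dot clock, 704 dot clocks per total line, 370 total lines per frame) are the commonly cited Mac 128K video timings, quoted from memory rather than taken from the guide above, so treat them as approximate:

```python
# Back-of-the-envelope check: could the Mac have done 384 visible lines at 60 Hz
# without changing the line time or the blanking intervals?
DOT_CLOCK_HZ = 15_667_200
DOTS_PER_LINE = 704          # 512 visible + horizontal blanking (assumed figure)
LINES_PER_FRAME = 370        # 342 visible + vertical blanking (assumed figure)

line_rate = DOT_CLOCK_HZ / DOTS_PER_LINE        # ~22.25 kHz
frame_rate = line_rate / LINES_PER_FRAME        # ~60.15 Hz
print(f"{line_rate:.1f} Hz line rate -> {frame_rate:.2f} Hz refresh at 342 visible lines")

# Keeping the same line time and the same ~28 lines of vertical blanking,
# a 384-line display would need roughly 412 total lines per frame:
print(f"{line_rate / (384 + 28):.1f} Hz refresh at 384 visible lines")  # ~54 Hz, well short of 60
```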

  • A lot of those old machines had clock speeds and video pixel rates that meshed together. On some color machines the system clock was an integer multiple of the standard colorburst frequency.

    The Timex Sinclair did all of its computation during the blanking interval, which is why it was so dog slow.

    • The Commodore Amigas had their 68k clock speed differ by region due to the colour carrier frequency difference (specifically, 2x the colorburst frequency for NTSC and 1.6x for PAL, which resulted in almost the same, but not quite identical, clock speeds).

      It's interesting how the differing vertical resolutions between the two (200p/400i vs 256p/512i) also had secondary effects on software design: it was always easy to tell whether a game was made in an NTSC region or with global releases in mind, because the bottom 20% of the screen was black in PAL.

  • Displays are still bandwidth killers today; we kept scaling them up along with everything else. Today you might have a 4K, 30 bpp, 144 Hz display, and just keeping it fed takes ~33 Gbit/s purely for scanout, before you even compose the image.
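
    For reference, the raw scanout arithmetic (pixel data only; the exact total depends on bit depth, blanking, and link encoding, so treat this as a ballpark sketch):

    ```python
    # Raw pixel data for a 3840x2160, 30 bpp, 144 Hz panel, ignoring blanking
    # and link encoding overhead -- the same ballpark as the figure above.
    width, height, bpp, hz = 3840, 2160, 30, 144
    gbits_per_s = width * height * bpp * hz / 1e9
    print(f"{gbits_per_s:.1f} Gbit/s")   # ~35.8 Gbit/s
    ```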

    • I have a 4k 60Hz monitor connected to my laptop over one USB-C cable for data and power, but because of bandwidth limitations my options are 4k30 and USB 3.x support or 4k60 and USB 2.0.

      I love the monitor, it's sharp and clear and almost kind of HDR a lot of the time, but the fact that it has a bunch of USB 3.0 ports that only get USB 2.0 speeds because I don't want choppy 30Hz gaming is just... weird.

      5 replies →

    • 4k jumped the gun. It’s just too many pixels and too many cycles. And unfortunately it was introduced just as pixel shaders started doing more work.

      Consequently almost nothing actually renders at 4k. It’s all upscaling - or, even worse, your display is wired to double up on inputs.

      Once we can comfortably get 60 FPS, 1080p, 4x msaa, no upscaling, then let’s revisit this 4k idea.

      5 replies →

    • We see this in embedded systems all the time too.

      It doesn't help if your crossbar memory interconnect only has static priorities.

    • And marketing said, back when LCDs were pushing CRTs out of the market, that you don't need to send the whole image to change a pixel on an LCD; you can change only that pixel.

      4 replies →

  • It's also interesting to look at other architectures at the time to get an idea of how fiendish a problem this is. At this time, Commodore, Nintendo, and some others, had dedicated silicon for video rendering. This frees the CPU from having to generate a video signal directly, using a fraction of those cycles to talk to the video subsystem instead. The major drawback with a video chip of some kind is of course cost (custom fabrication, part count), which clearly the Macintosh team was trying to keep as low as possible.

    • Both of the key 8-bit contenders of yore, the Atari 8-bit series and the Commodore 64, had custom graphics chips (ANTIC and VIC-II) that “stole” cycles from the 6502 (or 6510 in the case of the C64) whenever they needed to access memory.

      I remember writing CPU-intensive code on the Atari and using video blanking to speed it up.

    • And yet despite the lower parts count the Macintosh was more expensive than competing products from Commodore and Atari that had dedicated silicon for video rendering. I guess Apple must have had huge gross margins on hardware sales given how little was in the box.

      1 reply →

  • Exactly. Like the Apple ][’s, the original Mac framebuffer was set up with alternating accesses, relying on the framebuffer reads to manage DRAM refresh.

    It looks like DRAM was set up on a 6-CPU-cycle period, as 512 bits (32 16-bit bus accesses) x 342 lines x 60 Hz x 6 cycles x 2 gives 7.87968 MHz, which is just slightly faster than the nominal 7.83 MHz, the remaining .6% presumably being spent during vblank.
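
    Spelling that arithmetic out (the 6-cycle DRAM period and the 2:1 video/CPU split are the guesses above, not documented figures):

    ```python
    accesses_per_line = 512 // 16                  # 32 sixteen-bit fetches per scan line
    cpu_hz_needed = accesses_per_line * 342 * 60 * 6 * 2
    print(cpu_hz_needed)                           # 7,879,680 -> ~7.88 MHz
    print(f"{cpu_hz_needed / 7_833_600 - 1:.3%}")  # ~0.6% above the nominal 7.8336 MHz
    ```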

  • Why did they need 60 Hz? Why not 50, like Europe? Is there some massive advantage to syncing with the AC frequency of the local power grid?

    • If you’re used to seeing 60Hz everywhere like Americans are, 50Hz stands out like a sore thumb.

      But mostly I suspect it’s just far easier.

    • Conventional wisdom a few years after the Macintosh was that 50Hz was annoyingly flickery. Obviously this depends on your phosphors. Maybe it was already conventional wisdom at the time?

      I feel like the extra ~12% of screen real estate would have been worth it.

      2 replies →

The title is incorrect, because b&w Macs have a 512×342 resolution, not 512×324.

It wouldn't've been too crazy had Apple gone with 64K x 4 chips; they'd've needed just four of them to get 128 KB at a full 16 bits wide.

512x342 was 16.7% of 128 KB of memory, as opposed to 18.75% with 512x384. Not much of a difference. But having square pixels is nice.

The answer is something that's harder and harder to do these days with all the layers of abstraction -- set a performance target and use arithmetic to arrive at specifications that you can hit while still achieving your performance goal.

It's a bit of work, but I suspect you can arithmetic your way through the problem. Supposing they wanted 60 Hz on the display, a 1-bit framebuffer at 512x384 needs 196,608 bits / 24,576 bytes / 24 kbytes [below].

The Mac 128k shipped with a Motorola 68k at 7.8336 MHz, giving it 130,560 cycles per frame at 60 fps.

IIRC the word length of the 68k is 32 bits, so imagining a scenario where the screen was plotted in words at something like 20 cycles per fetch [1], you get about 6528 fetches per frame. At 32 bits a fetch, you need 6144 or so fetches from memory to fill the screen. You need a moment for horizontal refresh, so you lose time waiting for that; thus 6528 - 6144 = (drumroll) 384, the number of horizontal lines on the display.

I'm obviously just hitting the wavetops here and missing lots of details. But my point is that it's calculable with enough information, which is how engineers of yore used to spec things out (a sketch of the arithmetic follows the footnote below).

1 - https://wiki.neogeodev.org/index.php?title=68k_instructions_...

below - why bits? The original Mac used a 1-bit display, meaning each pixel used 1 bit to set it as either on or off. Because it didn't need 3 subpixels to produce color, the display was tighter and sharper than color displays, and even at the lower resolution appeared somewhat paperlike. The article is correct that the DPI was around 72. Another way to think about it, and what the Mac was targeting, was pre-press desktop publishing. Many printing houses could print at around 150-200 lines per inch. Houses with very good equipment could hit 300 or more. Different measures, but the Mac, being positioned as a WYSIWYG tool, did a good job of approximating the analog printing equipment of the time. (source: grew up in a family printing business)
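
A minimal sketch of the arithmetic above; the 20-cycles-per-fetch figure is the rough assumption taken from [1], not a measured number:

```python
CPU_HZ = 7_833_600
FPS = 60
WIDTH, HEIGHT = 512, 384
CYCLES_PER_FETCH = 20                      # assumed cost of one 32-bit fetch, per [1]

framebuffer_bits = WIDTH * HEIGHT          # 196,608 bits = 24,576 bytes = 24 KB at 1 bpp
cycles_per_frame = CPU_HZ // FPS           # 130,560
fetches_available = cycles_per_frame // CYCLES_PER_FETCH   # 6,528
fetches_needed = framebuffer_bits // 32    # 6,144 32-bit fetches to fill the screen
print(fetches_available - fetches_needed)  # 384 -- the "drumroll" number above
```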

  • The Motorola 68000 used had 16 data lines and 24 address lines, so it took at least two bus cycles just to transfer a full CPU word (disregarding timings on address latches etc).

    AFAIK some of the graphics code used fancy multi-register copies to increase cycle efficiency.

    As for the screen, IIRC making it easy to correlate "what's on screen" with "what's on paper" was a major part of what drove the Mac to be nearly synonymous with DTP for years.

  • In typography there are 72 points per inch so they made 1 pixel = 1 point.

Impressed to see how many people read the whole article and didn't just latch onto one phrase: "We don’t need a lot of the things that other personal computers have, so let’s optimize a few areas and make sure the software is designed around them".

The Mac was not a cheap machine, and Apple at that time was not rich enough to make unnecessary things - they really needed a hit this time, and they succeeded.

And yes, it is true they were limited by bandwidth; it is also true they were limited by the speed of the semi-32-bit CPU.

But the Mac was a real step ahead at the moment, and it had significant room to grow as new technology arrived. That is what I think the PCs of that time lacked.

The article really didn’t explain why they picked that number.

  • I don't know, but I can do some numerology: a 3:2 aspect ratio that's 512 pixels wide would need 341 and a third lines, so round up and you get 512 by 342 (quick arithmetic below).

    The later 384 number corresponds to an exact 4:3 aspect ratio.
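
    The quick arithmetic:

    ```python
    print(512 / (3 / 2))   # 341.33... lines for an exact 3:2, so round up to 342
    print(512 * 3 / 4)     # 384.0 lines for an exact 4:3
    ```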

  • For efficient graphics routines on a 32-bit machine, it's important that the resolution in the scan line direction (aka horizontal for normally mounted CRTs) be a multiple of 32, preferably a power of 2.

    The article mentions the desire for square pixels. So presumably they chose the horizontal resolution first and then chose the vertical resolution that gave them square pixels for a 512 pixel horizontal resolution.

  • The article says: "In short, there’s no easy answer to explain why early compact Macs ran at a screen resolution of 512×342. Rather, Apple was doing what it does best: designing a product with the right trade-offs for performance, ease of use, and cost."

It doesn't say exactly why 512x342 was chosen. But I'm more interested in why it was changed to 512x384 on later Macs. Is it just to fill the full 4:3 screen?

Beyond that, this article really wants to tell you how amazing that resolution was in 1984. Never mind that you could get an IBM XT clone with "budget" 720x348 monochrome Hercules graphics that year and earlier.

  • The 512x384 models are Macintosh LC-adjacent: the original LC monitor (the LC itself can do 640x480) or the Colour Classics. AFAIK it was partly in order to make the LC work better with the Apple IIe card (although the IIe software uses a 560x384 mode).

    A Hercules card, whilst nice, does suffer from the same non-square-pixel issue as the Lisa, so it's not as nice for creating a GUI.

    • > although the IIe software uses a 560x384 mode

      Nice, that's line doubled from the //e's 560x192 and would probably look crisp.

  • Both MDA and Hercules were 50 Hz. The real mid-eighties king of cheap, crisp displays would be the 12-inch 640x400@71Hz Atari SM124 monitor. You could buy an Atari ST + SM124 + an Atari SLM804 laser printer + the Calamus DTP package for the price of just the Apple laser printer alone :)

    • I had an XT clone + Hercules at the time (and SIMCGA for games), and the 50Hz refresh wasn't as bad as you'd think - the MDA CRTs were designed with slow-decay phosphors to reduce flicker.

      I actually had no idea that Atari made a laser printer. Everyone I knew with a ST (admittedly, not many people) was either doing MIDI or playing video games.

I always assumed it was a compromise between memory usage, refresh speed, and the GUI that they wanted. Don't forget that the Macintosh was preceded by the Lisa (720x364) and the IIGS (640x200), so they probably had a good sense for what was comfortable given a certain resolution.

The folklore link they reference: https://www.folklore.org/Five_Different_Macs.html

The 1st edition of Macworld; notably, the first page is an advert for Microsoft's products: the Multiplan spreadsheet, Word, etc. https://archive.org/details/MacWorld_8404_April_1984_premier...

The original floppy used on the Mac was a single-sided 400KB disk. I imagine that was another set of trade-offs. https://folklore.org/Disk_Swappers_Elbow.html

I remember in the early '80s using a computer (a Xerox Star, perhaps?) that used the CPU to generate the display. To speed up CPU-intensive tasks, you could blank the screen.

  • Alto had its entire display control in microcode, IIRC.

    In a similar vein, the Symbolics 3600 (at least the first model) had major portions of the disk driver implemented as one of the tasks in microcode (yes, the microcode was a multi-tasking system with preemption). I don't know how much of the MFM wrangling went on there, but ultimately it meant that reading or writing a page from/to disk was done by means of a single high-level instruction.

CRTs are very forgiving - 512x342 vs 512x384 would have made very little difference. You could still get square pixels by minor adjustments to vertical and horizontal size.

My question is: what are the htotal and vtotal times, in pixels and lines? Maybe there was a hardware saving in having vtotal exactly equal to 384 (which is 128 times 3). Perhaps they saved one bit in a counter, which may have resulted in one fewer TTL chip.

> but given the name of this website, it was pretty embarrassing.

Why, the name of the website is 512pixels.net not 342pixels.net; he nailed the 512 dimension. :)

> “To minimize CRT flicker, Apple worked to achieve a vertical refresh rate of 60 Hz”

… a limitation that many Macs, and even some iPhones, are still stuck with over 40 years later!

  • It's always surprising to me to see people regard a 60 Hz CRT as "flicker-free", "minimal flicker", etc. Whenever I saw a CRT running at 60 Hz, I'd immediately be able to tell. Always used at minimum 75 Hz but preferably 85 Hz at home (early 2000s, Windows).

    • Have you ever seen something running at 30 Hz? Or even 15? The difference in flicker between 30 and 60 is much, much larger than the difference between 60 and 120! Yeah, 60 isn't flicker-free - no finite number is (there are probably quantum limits) - but realistically you reach a point where you can't really tell. For most purposes 60 Hz is close enough, though you can still tell.

      8 replies →

    • I have recently been playing with CRTs again, and something that I have noticed is that for fast-paced games running at 60 or 70 Hz* I don't notice the flicker much, but for text anything less than 85 Hz is headache inducing. Luckily the monitor I got can do 1024x768 at 100 Hz :)

      * The original VGA and thus most MS-DOS games ran at 70 Hz.

      3 replies →

    • Monochrome CRT phosphors like P4 (zinc sulfide with silver) have longer persistence than the ones used in color CRTs, so flicker is less noticeable.

    • > Whenever I saw a CRT running at 60 Hz, I'd immediately be able to tell. Always used at minimum 75 Hz but preferably 85 Hz at home (early 2000s, Windows).

      Same, I remember installing some program that would let you quickly change the display settings on basically every computer I ever interacted with. It was especially bad if the CRT was in a room with fluorescent lighting.

      2 replies →

    • Me too. I'm also really sensitive to PWM. I tried using 85Hz on my VGA monitor but the higher signal bandwidth and cheap hardware made the video noticeably blurrier. 70 wasn't a great compromise either.

      Since TFTs came along I was bothered a lot less by it, because of the lack of flicker (though some cheap 4-bit TN LCDs still had it with some colours).

  • But there is less need because LCDs do not flicker (except some designed for videogames that strobe the backlight for some strange reason IIUC).

    I know I found the flicker of CRTs annoying even at 60 Hz.

    • Strobing the backlight seems like it would allow you to not illuminate the new frame of video until the liquid crystals have finished rotating, so you only have to contend with the persistence of vision on your retina rather than also with the persistence of the liquid crystals.

      2 replies →

The answer is probably more akin to: "As small of a resolution as they could make it without Steve bitching at them about it."

  • Apple could use some Steve drama, they seem to be moving backwards lately.

Regardless of whether we go with 512x324, 512x342, or 512x384, the claims of exactly 72 PPI and an exactly 9" diagonal are not simultaneously possible.

Extremely nitpicky thing I know, but this kinda stuff really bugs me, could somebody please clarify what was the real size (and/or PPI) here?

For reference:

512x324 @ 72 PPI = 8.42" (or 214 mm) (rounded)

512x342 @ 72 PPI = 8.55" (or 217 mm) (rounded)

512x384 @ 72 PPI = 8.89" (or 226 mm) (rounded)

The first two don't even yield an integer result for the number of diagonal pixels, let alone yield an integer multiple of 72. Or would there be bars around the screen, or how would this work?
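
For anyone who wants to reproduce the figures above:

```python
from math import hypot

for w, h in [(512, 324), (512, 342), (512, 384)]:
    diag_px = hypot(w, h)           # diagonal length in pixels
    diag_in = diag_px / 72          # at exactly 72 PPI
    print(f"{w}x{h}: {diag_px:.2f} px diagonal = {diag_in:.2f} in = {diag_in * 25.4:.0f} mm")
```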

  • For CRTs, the diagonal measurement was of the physical tube. The actual viewable area was smaller. Part of the tube’s edges were covered by plastic, and there was always also some margin that wasn’t used for picture so it was just black.

    It was a 9” tube with a 3:2 aspect ratio. Your calculation of an 8.5” image at 72 dpi sounds right.

    • > For CRTs, the diagonal measurement was of the physical tube. The actual viewable area was smaller.

      That's also why TVs and monitors of that era always seemed smaller than advertised. I remember having to explain that to a lot of people.

  • You're quite right, the screen would be centred with margin/border around it.

    Whilst the CRT is 9", according to period repair guides the screen should be adjusted so that the visible image was 7.11" x 4.75", pretty much exactly 1.5:1. This meant 72dpi, which was to match PostScript point size for print output and WYSIWYG.

    So it's your 8.55" diagonal.

    Some classic Macintosh users today are unaware of this screen size reasoning, or don't agree with it, and stretch the screen to fill the whole CRT. Yikes!

    BTW, I posted pretty much the same info earlier today at https://news.ycombinator.com/item?id=44105531 — what synchronicity!

  • A square that's one thousand units by one thousand units doesn't give a rational number, much less an integer one, for the diagonal.

    A 9" CRT would never be precisely 9", because beam trace width and height are analog, plus there's overscan, so a 9" screen would simply give something pretty close to 9".