Bill Atkinson has died

21 hours ago (daringfireball.net)

https://facebook.com/story.php?story_fbid=10238073579963378&...

In an alternate timeline, HyperCard was not allowed to wither and die, but instead continued to mature, embraced the web, and inspired an entire genre of software-creating software. In this timeline, people shape their computing experiences as easily as one might sculpt a piece of clay, creating personal apps that make perfect sense to them and fit like a glove; computing devices actually become (for everyone, not just programmers) the "bicycle for the mind" that Steve Jobs spoke of. I think this is the timeline that Atkinson envisioned, and I wish I lived in it. We've lost a true visionary. Memory eternal!

  • Maybe there's some sense of longing for a similar tool today, but there's no real way of knowing how much of the impact you're talking about HyperCard actually had. For example, many of us reading here experienced HyperCard. It planted seeds in our future endeavors.

    I remember in elementary school, I had some computer lab classes where the whole class worked in HyperCard on some task. Multiply that by however many classrooms did something like that in the '80s and '90s. That's a lot of brains that can be influenced, and have been.

    We can judge it as a success in its own right, even if it never entered the next paradigm or never had quite an equivalent later on.

    • HyperCard was undoubtedly the inspiration for Visual Basic, which for quite some time dominated the bespoke UI industry in the same way web frameworks do today.

    • Word. This is the Papert philosophy of constructionism, learning to think by making, that so many of us still carry. I'm still trying to build software-building software. We do live in that timeline; it's just unevenly distributed.

    • HyperCard was the foundation of my programming career. I treated the HyperCard Bible like an actual Bible.

  • The Web was significantly influenced by HyperCard. Tim Berners-Lee's original prototypes envisioned it as bidirectional, with a hypertext editor shipping alongside the browser. In that sense it does live on, and serves as the basis for much of the modern Internet.

    • IIRC, the mouse pointer turning into a hand when you mouse over something clickable was original to HyperCard. And I think Brendan Eich was heavily influenced by HyperTalk when he created JavaScript.

    • I honestly don't think the modern web is a legitimate hypertext system at this point. It was already bad enough 20 years ago with Flash and server-side CGI, but now most of the major websites are just serving JavaScript programs that then fetch data using a dedicated API. And then there are all the paywalls and constant CAPTCHA checks to make sure you aren't training an LLM on their content without a license.

      Look up Hyperland, an early-'90s documentary by Douglas Adams and the guy from Doctor Who about the then-future hypermedia revolution. I can remember the web resembling that a long time ago, but the modern web is very far removed from anything remotely resembling hypertext.

  • We kind of had that for a time with FileMaker and MS Access. People could build pretty amazing stuff with those apps, even without being a programmer.

    I think the reason those apps never became mainstream is that they didn't have a good solution for sharing data. There were some ways you could use them to access database servers, but setting them up was so difficult that they were for all intents and purposes limited to local, single-user programs.

    HTML, CSS, PHP and MySQL had a learning curve, but you could easily make multi-user programs with them. That's why the web won.

    • Yes! I used FileMaker a lot, and built my first journaling system with it: like a cross between HyperCard and a wiki. It really changed my life, and this led to programming.

    • Yes. They didn't survive the transition from workgroup (shared files on a LAN) to client/server.

  • Mr. Atkinson's passing was sad enough without thinking about this.

    (More seriously: I can still recall using ResEdit to hack a custom FONT resource into a HyperCard stack, then using string manipulation in a text field to create tiled graphics. This performed much better than button icons or any other approach I could find. And then it stopped working in System 7.)

  • It’s ironic that the next graphical programming environment similar to HyperCard was probably Flash - and it obviously died too.

    What actually are the best successors now, at least for authoring generic apps for the open web? (Other than vibe coding things)

    • - Minecraft
      - Roblox
      - LittleBigPlanet
      - Mario Maker

      This is what kids do to be creative.

      Slightly more serious (and therefore less successful):

      - Logo/Turtle Graphics
      - Scratch
      - HyperStudio

      HyperCard was both graphic design and hypertext (links). These two modalities got separated, and I think there are practical reasons for that: HTML/CSS design actually sucks and never became an amateur art form.

      For writing and publishing we got wikis, Obsidian et al., blogs (RIP), forums, and social media. They're not meant to be interactive or programmable, but they fulfill people's need for publishing.

    • Pretty sure the next after HyperCard was MacroMind (later Macromedia) Director. I recall running an early version of a Director animation on a black-and-white Mac not long after I started playing with HyperCard. Later I was a Director developer. I recall when FutureSplash was released -- the fast-scaling vector graphics were a new and impressive thing. The web browser plugin helped a lot and it really brought multimedia to the browser. It was only later that Macromedia acquired FutureSplash and renamed it Flash.

    • Flash completely missed the most important point of HyperCard, which was that end users could put it into edit mode, explore the source code, learn from it, extend it, copy parts of it out, and build their own user interfaces with it.

      It's not just "View Source", but "Edit Source" with a built-in, easy to use, scriptable, graphical, interactive WYSIWYG editor that anyone can use.

      HyperCard did all that and more long before the web existed. It was fully scriptable years before JavaScript existed, and extensible with plug-in XCMDs long before COM/OLE/ActiveX, or even OpenDoc/CyberDog or Java/HotJava/applets. It was widely available and embraced by millions of end users, and was used for games, storytelling, art, business, personal productivity, app development, education, publishing, porn, and so much more, way before merely static web page WYSIWYG editors (let alone live interactive scriptable extensible web application editors) ever existed.

      LiveCard (HyperCard as a live HTTP web app server back-end via WebStar/MacHTTP) was probably the first tool that made it possible to create live web pages with graphics and forms with an interactive WYSIWYG editor that even kids could use to publish live HyperCard apps, databases, and clickable graphics on the web.

      HyperCard deeply inspired HyperLook for NeWS, which was scripted, drawn, and modeled with PostScript, and which I used to port SimCity to Unix:

      Alan Kay on “Should web browsers have stuck to being document viewers?” and a discussion of Smalltalk, HyperCard, NeWS, and HyperLook

      >It had an AppleScript / OSA API that let you write handlers for responding to web hits in other languages that supported AppleScript.

      I used it to integrate ScriptX with the web:

      http://www.art.net/~hopkins/Don/lang/scriptx/scriptx-www.htm...

      https://medium.com/@donhopkins/1995-apple-world-wide-develop...

      The coolest thing somebody did with WebStar was to integrate it with HyperCard so you could actually publish live INTERACTIVE HyperCard stacks on the web: you could see them as images you could click on to follow links, followed by HTML form elements corresponding to the text fields, radio buttons, checkboxes, drop-down menus, scrolling lists, etc. in the HyperCard stack, which you could use in the browser to interact with live HyperCard pages!

      That was the earliest and easiest way that non-programmers and even kids could not just create graphical web pages, but publish live interactive apps on the web!

      Using HyperCard as a CGI application

      https://web.archive.org/web/20060205023024/http://aaa-protei...

      https://web.archive.org/web/20021013161709/http://pfhyper.co...

      http://www.drdobbs.com/web-development/cgi-and-applescript/1...

      "In an alternate timeline, HyperCard was not allowed to wither and die, but instead continued to mature, embraced the web..."

      In yet another alternate timeline, someone thought to add something like URLs, with something like GET, PUT, etc., to HyperCard, and Tim Berners-Lee's invention of the web browser never happened because HyperCard already did it all.

      • On one hand this would be simply amazing; on the other hand it would have been a total security nightmare that makes early JavaScript look like a TPM Secure Enclave.

    • Not sure that sculpting clay is the best analogy. Lots of sculpting is hard, as is turning clay, especially if you want to successfully fire the result. Maybe it is an accurate analogy, but people may understand the difficulty differently.

      • HyperCard is more like Lego: you can simply buy completed sets (use others' HyperCard programs), or you can put things together according to instructions, but you can always take them apart and change them, and eventually build your own.

    • I haven't posted it here yet b/c it's not show ready, but we have been building this vision -- I like to think of it as an e-bike for the mind.

      https://vibes.diy/

      We had a lot of fun last night with Vibecode Karaoke, where you code an app at the same time as you sing a song.

    • https://news.ycombinator.com/item?id=22285675

      DonHopkins on Feb 10, 2020, on: HyperCard: What Could Have Been (2002)

      Do you have the first commercial HyperCard stack ever released: the HyperCard SmutStack? Or SmutStack II, the Carnal Knowledge Navigator, both by Chuck Farnham? SmutStack was the first commercial HyperCard product available at rollout, released two weeks before HyperCard went public at a MacWorld Expo, cost $15, and made a lot of money (according to Chuck). SmutStack 2, the Carnal Knowledge Navigator, had every type of sexual adventure you could imagine in it, including information about gays, lesbians, transgendered, HIV, safer sex, etc. Chuck was also the marketing guy for Mac Playmate, which got him on Geraldo, and sued by Playboy.

      https://www.zdnet.com/article/could-the-ios-app-be-the-21st-...

      >Smut Stack. One of the first commercial stacks available at the launch of HyperCard was Smut Stack, a hilarious collection (if you were in sixth grade) of somewhat naughty images that would make a joke, present a popup image, or play a fart sound when the viewer clicked on them. The author was Chuck Farnham of Chuck's Weird World fame.

      >How did he do it? After all, HyperCard was a major secret down at Cupertino, even at that time before the wall of silence went up around Apple.

      >It seems that Farnham was walking around the San Jose flea market in the spring of 1987 and spotted a couple of used Macs for sale. He was told that they were broken. Carting them home, he got them running and discovered several early builds of HyperCard as well as its programming environment. Fooling around with the program, he was able to build the Smut Stack, which sold out at the Boston Macworld Expo, being one of the only commercial stacks available at the show.

      https://archive.org/stream/MacWorld_9008_August_1990/MacWorl...

      Page 69 of https://archive.org/stream/MacWorld_9008_August_1990

      >Farnham's Choice

      >This staunch defender was none other than Chuck Farnham, whom readers of this column will remember as the self-appointed gadfly known for rooting around in Apple's trash cans. One of Farnham's myriad enterprises is Digital Deviations, whose products include the infamous SmutStack, the Carnal Knowledge Navigator, and the multiple-disk set Sounds of Susan. The last comes in two versions: a $15 disk of generic sex noises and, for $10 more, a personalized version in which the talented Susan moans and groans using your name. I am not making this up.

      >Farnham is frank about his participation in the Macintosh smut trade. “The problem with porno is generic,” he says, sounding for the briefest moment like Oliver Wendell Holmes. “When you do it, you have to make a commitment ... say you did it and say it’s yours. Most people would not stand up in front of God and country and say, ‘It’s mine.’ I don’t mind being called Mr. Scum Bag.”

      >On the other hand, he admits cheerily, “There’s a huge market for sex stuff.” This despite the lack of true eroticism. “It’s a novelty,” says Farnham. Sort of the software equivalent of those ballpoint pens with the picture of a woman with a disappearing bikini.

      https://archive.org/stream/NewComputerExpress110/NewComputer...

      Page 18 of https://archive.org/stream/NewComputerExpress110

      >“Chuck developed the first commercial stack, the Smutstack, which was released two weeks before HyperCard went public at a MacWorld Expo. He’s embarrassed how much money a silly collection of sounds, cartoons, and scans of naked women brought in. His later version, the Carnal Knowledge Navigator, was also a hit.

      I've begged Chuck to dig around to see if he has an old copy of the floppy lying around and upload it, but so far I don't know of a copy online you can run. Its bold pioneering balance of art and sleaze deserves preservation, and the story behind it is hilarious.

      Edit: OMG I've just found the Geraldo episode with Chuck online, auspiciously titled "Geraldo: Sex in the 90's. From Computer Porn to Fax Foxes", which shows an example of Smut Stack:

      https://visual-icon.com/lionsgate/detail/?id=67563&t=ts

      I love the way Chuck holds his smirk throughout the entire interview. And Geraldo's reply to his comment: "I was a fulfillment house for orders."

      "That sounds sexual in itself! What was a fulfilment house?"

    • I actually had an experience like this yesterday. After reading Gruber talk about how Markdown was never meant for notes, I started to rethink things. I wanted plain text, to be future proof, then stumbled across CotEditor as a means to edit. Inside I was able to use the code highlighting and outline config to define my own regex and effectively create my own markup language with just a dash of regex and nothing more. I then jumped over to Shortcuts and dragged and dropped some stuff together to open/create yearly and daily notes (on either my computer or phone), or append to a log with a quick action.

      It is a custom system that didn’t require any code, if you don’t count the very minor bits of regex (just a lot of stuff like… ^\s- .).

      Is it a good system? Probably not, but we’ll see where it goes.

    • "inspired an entire genre of software-creating software. In this timeline, people shape their computing experiences as easily as one might sculpt a piece of clay, creating personal apps that make perfect sense to them and fit like a glove"

      LLMs inspired vibe coding - that’s our timeline.

    When I was on the ColorSync team at Apple we, the engineers, got an invite to his place in the woods one day.

    I knew who he was at the time, but for some reason I felt I was more or less beholden to conversing only about color-related issues and how they applied to a computer workflow. Having retired, I have been kicking myself for some time for not just chatting with him about ... whatever.

    He was at the time I met him very into a kind of digital photography. My recollection was that he had a high-end drum scanner and was in fact scanning film negatives (from a medium format camera?) and then going with a digital workflow from that point on. I remember he was excited about the way that "darks" could be captured (with the scanner?). A straight analog workflow would, according to him, cause the darks to roll off (guessing the film was not the culprit then, but perhaps the analog printing process).

    He excitedly showed us on his computer photos he had taken along the Pacific Ocean of large rock outcroppings against the water — pointing out the detail that you could see in the shadow of the rocks. He was putting together a coffee table book of his photos at the time.

    I have to say that I mused at the time about a wealthy, retired engineer who throws money at high-end photo gear and suddenly thinks they're a photographer. I think I was weighing his "technical" approach to photography vs. a strictly artistic one. Although, having learned more about Ansel Adams's technical chops, perhaps for the best photographers there is overlap.

    • > I have been kicking myself for some time not just chatting with him about ... whatever.

      Maybe I should show some initiative! See, for a little while now I've wanted to just chat with you about whatever.

      At this moment I'm working on a little research project about the advent of color on the Macintosh, specifically the color picker. Would you be interested in a casual convo that touches on that? If so, I can create a BlueSky account and reach out to you over there. :)

      https://merveilles.town/deck/@rezmason/114586460712518867

      • John is cool, but I don't think he was around when the Macintosh II software and hardware was being designed for color support. I did work with Eric Ringewald at Be and he was one of the Color Quickdraw engineers. He would be fun to talk to. Michael Dhuey worked on the hardware of the Mac II platform. I guess we can give some credit to Jean-Louis Gassée as well. Try to talk to those people! I got to work with a lot of these Apple legends at General Magic, Be, Eazel and then back at Apple again. I never got to work on a project with JKCalhoun directly, but I did walk by his office quite frequently.

    • Bill showed up at one of the WWDCs (2011?). I sat next to him during a lunch, not knowing who he was! He told me his name, and then showed me some photos he had taken. He seemed to me to be a gentle and kind soul. So sad to read this news.

    • There probably still isn't a good way to get that kind of dynamic range entirely in the digital domain. Oh, I'm sure the shortfall today is smaller, say maybe four or five stops versus probably eight or twelve back then. Nonetheless, I've done enough work in monochrome to recognize an occasional need to work around the same limitations he was, even though very few of my subjects are as demanding.

      • I wish a good monochrome digital camera didn't cost a small fortune. And I'm too scared to try to remove the Bayer grid from a "color" CCD.

        Seems that, without the color/Bayer thing, you could get an extra stop or two for low-light.

        I had a crazy notion to make a camera around an astronomical CCD (often monochrome) but they're not cheap either — at least one with a good pixel count.

    • :) Color in the computer is a good “whatever” topic.

      Sometimes it’s just nice to talk about the progress of humanity. Nothing better than being a part of it, the gears that make the world turn.

      • Ha ha, but it's also "talking shop". I'm sure Bill preferred it to talking about his QuickDraw days.

    • You always lose something when doing optical printing - you can often gain things too, but it's not 1:1.

      I adore this hybrid workflow, because I can pick how the photo will look (color palette, grain, whatever) by picking my film, then use digital to fix (most if not all of) the inherent limitations of analog film.

      Sadly, film is too much of a pain today. Photography has long been about composition for me, not cameras or process - I liked film because I got a consistent result, but I can use digital too, and I do today.

    • "When art critics get together they talk about form and structure and meaning. When artists get together they talk about where you can buy cheap turpentine."

    • > I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer.

      Duchamp would like a word.

      Seriously though, as someone this describes to a T (though “suddenly” in this case is about 19 years), I was afraid to call myself any sort of artist for well over a decade, thinking I was just acquiring signal with high end gear. I didn’t want to try to present myself as something I’m not. After all, I just push the button, the camera does all the work.

      I now have come to realize that this attitude is toxic and unnecessary. Art (even bad art!) doesn’t need more gatekeeping or gatekeepers.

      I am a visual artist. A visual artist with perhaps better equipment than my skill level or talent justifies, but a visual artist nonetheless.

    • > I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer

      I think this says more about you than it does about him

    From Walter Isaacson's _Steve Jobs_:

    > One of Bill Atkinson’s amazing feats (which we are so accustomed to nowadays that we rarely marvel at it) was to allow the windows on a screen to overlap so that the “top” one clipped into the ones “below” it. Atkinson made it possible to move these windows around, just like shuffling papers on a desk, with those below becoming visible or hidden as you moved the top ones. Of course, on a computer screen there are no layers of pixels underneath the pixels that you see, so there are no windows actually lurking underneath the ones that appear to be on top. To create the illusion of overlapping windows requires complex coding that involves what are called “regions.” Atkinson pushed himself to make this trick work because he thought he had seen this capability during his visit to Xerox PARC. In fact the folks at PARC had never accomplished it, and they later told him they were amazed that he had done so. “I got a feeling for the empowering aspect of naïveté”, Atkinson said. “Because I didn’t know it couldn’t be done, I was enabled to do it.” He was working so hard that one morning, in a daze, he drove his Corvette into a parked truck and nearly killed himself. Jobs immediately drove to the hospital to see him. “We were pretty worried about you”, he said when Atkinson regained consciousness. Atkinson gave him a pained smile and replied, “Don’t worry, I still remember regions.”

    • With overlapping rectangular windows (a slightly simpler case than ones with rounded corners) you can expect the visible regions of windows that are not foremost to be, for example, "L" shaped or "T" shaped (if there are many windows and they overlap left and right edges). Bill's region structure was, as I understand it, more or less an RLE (run-length encoded) representation of the visible rows of a window's bounds. The region for the topmost window (not occluded in any way) would indicate the top row as running from 0 to width-of-window (or the right edge of the display if clipped by the display). I believe too there was a shortcut to indicate "oh, and the following rows are identical" so that an un-occluded rectangular window would have a pretty compact region representation.

      Windows partly obscured would have rows that may not begin at 0, may not continue to width-of-window. Window regions could even have holes if a skinnier window was on top and within the width of the larger background window.

      The cleverness, I think, was then to write fast routines to add, subtract, intersect, and union regions, and rectangles of this structure. Never mind quickly traversing them, clipping to them, etc.
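
      As a rough sketch of that idea (the names and encoding here are guesses for illustration, not QuickDraw's actual format), each entry below gives a starting scanline plus its visible spans, which stay in effect for following rows until the next entry changes them:

```python
# Hypothetical row-compressed region: a list of (starting_y, spans),
# where each row of spans repeats until the next entry's starting_y.
# An unobscured rectangular window compresses to a single entry.
unobscured = [(0, [(0, 640)])]                   # full width on every row
l_shaped = [(0, [(0, 640)]), (200, [(0, 320)])]  # right half hidden below y=200

def spans_at(region, y):
    """Return the visible horizontal spans on scanline y."""
    spans = []
    for start_y, row in region:
        if start_y > y:
            break
        spans = row
    return spans

spans_at(l_shaped, 100)  # → [(0, 640)]
spans_at(l_shaped, 300)  # → [(0, 320)]
```

      The "following rows are identical" shortcut falls out for free here: a run of identical rows is just one entry.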

      • The QuickDraw source code refers to the contents of the Region structure as an "unpacked array of sorted inversion points". It's a little short on details, but you can sort of get a sense of how it works by looking at the implementation of PtInRgn(Point, RegionHandle):

        https://github.com/historicalsource/supermario/blob/9dd3c4be...

        As far as I can tell, it's a bounding box (in typical L/T/R/B format), followed by a sequence of the X/Y coordinates of every "corner" inside the region. It's fairly compact for most region shapes which arise from overlapping rectangular windows, and very fast to perform hit tests on.
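
        That parity test can be sketched in a few lines (a toy model, not the real QuickDraw layout: here the region is a bounding box plus, per scanline, the sorted x coordinates where the inside/outside state flips):

```python
from bisect import bisect_right

def pt_in_region(x, y, bbox, rows):
    """A point is inside iff it's in the bounding box and an odd number
    of inversion points lie at or to the left of it on its scanline."""
    left, top, right, bottom = bbox
    if not (left <= x < right and top <= y < bottom):
        return False
    return bisect_right(rows.get(y, []), x) % 2 == 1

# An L-shaped region: full width on rows 0-1, left half on rows 2-3.
bbox = (0, 0, 10, 4)
rows = {0: [0, 10], 1: [0, 10], 2: [0, 5], 3: [0, 5]}

pt_in_region(7, 1, bbox, rows)  # → True
pt_in_region(7, 3, bbox, rows)  # → False
```

        The bounding-box check first is what makes the common cases (point well outside, or region a plain rectangle) so cheap.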

      • The key seems to have been recognizing the utility of the region concept and making it fundamental to the QuickDraw API (and the clever representation that made finding the main rectangular portions easy). This insulated QuickDraw from the complexity of windowing system operations. Once you go implementing region operations you probably find that it's fairly efficient to work out the major rectangular regions so you can use normal graphics operations on them, leaving small areas that can just be done inefficiently as a bunch of tiny rectangles. All this work for clipped graphics was applicable to far more than just redrawing obscured window content, so it could justify more engineering time to polishing it. Given how easy they were to use, more things could leverage the optimization (e.g. using them to redraw only the dirty region when a window was uncovered).

    • I think the difference between the Apple and Xerox approaches may be more complicated than the people at PARC not knowing how to do this. The Alto doesn't have a single framebuffer; each window has its own buffer, and the microcode walks the windows to work out what to put on each scanline.

      • Not doubting that, but what is the substantive difference here? Does the fact that there is a screen buffer on the Mac facilitate clipping that is otherwise not possible on the Alto?

      • Frame buffer memory was still incredibly expensive in 1980. Our lab's 512 x 512 x 8-bit table-lookup color buffer cost $30,000 in 1980. The Mac's 512 x 342 x 1-bit buffer in 1984 had to fit the Mac's $2,500 price. The Xerox Alto was earlier than these two devices and would have cost even more if it had a full frame buffer.

      • Reminds me of a GPU's general workflow. (Like the sibling comment says: isn't that the obvious way this is done? Different drawing areas being hit by 'firmware' / 'software' renderers?)

    • Would someone mind explaining the technical aspect here? I feel that with modern compute and OS paradigms I can't appreciate this. But even now I know that feeling when you crack it, and the thrill of getting the impossible to work.

      It’s on all of us to keep the history of this field alive and honor the people who made it all possible. So if anyone would nerd out on this, I’d love to be able to remember him that way.

      (I did read this https://www.folklore.org/I_Still_Remember_Regions.html but might be not understanding it fully)

      • There were far fewer abstraction layers than today. Today when your desktop application draws something, it gets drawn into a context (a "buffer") which holds the picture of the whole window. Then the window manager / compositor simply paints all the windows on the screen, one on top of the other, in the correct priority (I'm simplifying a lot, but just to get the idea). So when you are programming your application, you don't care about other applications on the screen; you just draw the contents of your window and that's done.

        Back at the time, there wouldn't have been enough memory to hold a copy of the full contents of all possible windows. In fact, there were actually zero abstraction layers: each application was responsible for drawing itself directly into the framebuffer (array of pixels), into its correct position. So how to handle overlapping windows? How could each application draw itself on the screen, but only on the pixels not covered by other windows?

        QuickDraw (the graphics API written by Atkinson) contained this data structure called "region" which basically represents a "set of pixels", like a mask. And QuickDraw drawing primitives (eg: text) supported clipping to a region. So each application had a region instance representing all visible pixels of the window at any given time; the application would then clip all its drawing to the region, so that only the visible pixels would get updated.

        But how was the region implemented? Obviously it could not have been a bitmask of pixels, as that would use too much RAM and would be slow to update. Consider also that the region data structure had to be quick at operations like intersections, unions, etc., as the operating system had to update the regions for each window as windows got dragged around with the mouse.

        So the region was implemented as a bounding box plus a list of visible horizontal spans (I think; I don't know the exact details). When you represent a list of spans, a common hack is to simply use a list of coordinates at which the "state" switches between "inside the span" and "outside the span". This approach makes for some nice tricks when doing operations like intersections.

        Hope this answers the question. I'm fuzzy on many details so there might be several mistakes in this comment (and I apologize in advance), but the overall answer should be good enough to highlight the differences compared to what computers do today.
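
        The switch-coordinate trick mentioned above might look something like this (a guess at the general technique, not Atkinson's actual code): walk the merged coordinates of two span lists, track whether each input is currently "inside", and emit a coordinate wherever the combined state flips:

```python
def combine(a, b, op):
    """Combine two sorted lists of inside/outside switch coordinates.
    op is the boolean combiner: AND for intersection, OR for union."""
    out, in_a, in_b, state = [], False, False, False
    for x in sorted(set(a) | set(b)):
        if x in a:
            in_a = not in_a
        if x in b:
            in_b = not in_b
        new_state = op(in_a, in_b)
        if new_state != state:
            out.append(x)
            state = new_state
    return out

# Spans [0,10) and [5,15):
combine([0, 10], [5, 15], lambda p, q: p and q)  # intersection → [5, 10]
combine([0, 10], [5, 15], lambda p, q: p or q)   # union → [0, 15]
```

        Subtraction and XOR are the same loop with a different combiner, which is why one representation can serve all the region operations.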

    • > In fact the folks at PARC had never accomplished it, and they later told him they were amazed that he had done so.

      Reminds me of the story where some company was making a new VGA card, and it was rumored that a rival company had implemented a buffer of some sort in their card. When both cards came out, the rival had either not actually implemented it or had implemented a far simpler solution.

      • An infamous StarCraft example echoes a similar story: the team was so humbled by a competitor's demo (and by criticism that their own game was simply "Warcraft in space") that they went back and significantly overhauled their game.

        Former Ion Storm employees later revealed that Dominion’s E3 1996 demo was pre-rendered, with actors pretending to play, not live gameplay.

      • Michael Abrash's black book of graphics programming. They heard about a "buffer", so implemented the only non-stupid thing - a write FIFO. Turns out the competition had done the most stupid thing and built a read buffer.

        I teach this lesson to my mentees. Knowing that something is possible gives you significant information. Also, don't brag - It gives away significant information.

        Just knowing something is possible makes it much, much easier to achieve.

        https://valvedev.info/archives/abrash/abrash.pdf

    • Pretty awesome story, but also with a bit of a dark lining. Of course any owner, and triple that for Jobs, loves over-competent guys who work themselves to death, here almost literally.

      But that's not a recipe for personal happiness for most people, and most of us would not end up contributing revolutionary improvements even if we did. The world needs awesome workers, and it also needs, e.g., awesome parents, or just happy, balanced, content people (or at least some of each).

      • Pretty much. Most of us have creative itches to scratch that make us a bit miserable if we never get to pursue them, even if given a comfortable life. It’s circumstantial whether we get to pursue them as entrepreneurs or employees. The users or enjoyers of our work benefit either way.

        2 replies →

      • Survivorship bias. The guys going home at 5 went home at 5 and their companies are not written about. It’s dark but we’ve been competing for a while as life forms and this is “dark-lite” compared to what our previous generations had to do.

        Some people are competing, and need to make things happen that can’t be done when you check out at 5. Or more generally: the behaviour that achieves the best outcome for a given time and place, is what succeeds and forms the legends of those companies.

        If you choose one path, know your competitors are testing the other paths. You succeed or fail partly based on what your most extreme competitors are willing to do, sometimes with some filters for legality and morality. (I.e. not universally true for all countries or times.)

        Edit: I currently go home at 5, but have also been the person who actually won the has-no-life award. It’s a continuum, and is context specific. Both are right and sometimes one is necessary.

      • What is the dark lining? Do you think Atkinson did not feel totally satisfied with his labour?

        And I don't think anyone said that that's the only way to be.

    Bill's contribution with HyperCard is of course legendary. Apart from the experience of classrooms and computer labs in elementary schools, it was also the primary software powering a fusion of bridge-simulator-meets-live-action-drama field trips (among many other things) for over 20 years at the Space Center in central Utah.[0] I was one of many beneficiaries of this program as a participant, volunteer, and staff member. It was among the best things I've ever done.

    That seed crystal of software shaped hundreds of thousands of students who to this day continue to rave about this program (although the last bits of HyperCard were retired permanently about 12 years ago; nowadays it's primarily web-based tech).

    HyperCard's impact on teaching students to program starship simulators, and then telling compelling, interactive, immersive, multi-player dramatic stories in those ships is something enabled by Atkinson's dream in 1985.

    May your consciousness journey between infinite pools of light, Bill.

    Also, if you've read this far, go donate to Pancreatic Cancer research.[1]

    [0]: https://spacecenter.alpineschools.org

    [1]: https://pancan.org

    I first met Bill over video-chat during 2020 and we got to know each other a bit. He later sent me a gift that changed my life. We hadn't talked for the past couple years, but I know he experienced "death" before and was as psychologically prepared as anyone could be. I have no doubt that he handled the biggest trip of his life with grace. We didn't always see eye-to-eye when it came to software, but we did share a mutual interest in the unknown, and the meaning of it all. Meet ya on the other side, Bill.

    If you haven't, check out the documentary[0] on General Magic which Bill co-founded in 1990. Among the more remarkable scenes in there is when a member of the public seems perplexed by the thought that they would even want to "check email from Times Square."

    An unthinkable future, but they thought it. And yet, most folks have never heard of General Magic.

    0. https://www.youtube.com/watch?v=JQymn5flcek

    I never met Bill, and he never knew I existed, but he has had such a huge impact on my career, my family and my prosperity. I started my programming passion on the Apple II and switched to the Mac in 1984 after seeing MacPaint. HyperCard was very impactful on my logical thinking, opened my eyes to the incredible possibilities of this machine, and taught me how to conceptualise information. His humble efforts have had such a profound effect. I'm so very full of grief upon hearing this news.

    Some notable stories from Folklore.org:

    https://www.folklore.org/Joining_Apple_Computer.html

    https://www.folklore.org/Negative_2000_Lines_Of_Code.html --- something to bring up whenever lines of code is put forward as a metric

    https://www.folklore.org/Rosings_Rascals.html --- story of how the Macintosh Finder came to be

    https://www.folklore.org/I_Still_Remember_Regions.html --- surviving a car accident

    People today take the WIMP interface for granted and forget about the pioneers who invented it.

    It's really sad to see desktop apps adopt hamburger menus and things that make sense on mobile but make life harder on a desktop built for WIMP.

    Thank you, Bill! Some days I'd rather be using your interface.

    Atkinson's HyperCard was released in 1987, before the widespread adoption of the web. HyperCard introduced concepts like interactive stacks of cards, scripting, and linking, which were later adopted and expanded upon in the web. Robert Cailliau, who assisted Tim Berners-Lee in developing the first web browser, was influenced by HyperCard's hyperlink concept.

    For anyone (like me) wondering who this guy was, he was a prominent UI guy at Apple back in the day. According to Wikipedia he created the menu bar, QuickDraw, and HyperCard.

    For whoever submits stories like this, please say who the person was. Very few people are so famous that everyone in tech knows who they were, and Mr. Atkinson was not one of them. I've heard of his accomplishments, but never of the man himself.

    • Adding a bit more context: The World Wide Web arguably exists because of HyperCard. The idea that information can be hyperlinked together.

      Atkinson was a brilliant engineer. As critical to the launch of the Macintosh as anyone - efficient rendering of regions, overlapping windows, etc.

      And last but not least, MacPaint. Every computer painting program in existence owes Atkinson a nod.

      • The idea that information can be hyperlinked together predated HyperCard by decades. It goes back to https://www.theatlantic.com/magazine/archive/1945/07/as-we-m..., which was written in 1945. The same essay also has the fundamental ideas for a citation index.

        This gave rise both to the Science Citation Index and to various hypertext systems. For example the famous 1968 presentation https://www.youtube.com/watch?v=yJDv-zdhzMY, now known as "The Mother of All Demos", demonstrated a working hypertext system among the other jaw-dropping accomplishments.

        HyperCard brought hypertext to commodity hardware. The Web made a distributed hypertext system viable. Google's PageRank recombined hypertext and the Science Citation Index to make the web more usable. And all of the key insights trace back to Vannevar Bush. Who was able to have such deep insights in 1945 because he had been working in, and thinking about, computing at least since 1927.

        The history of important ideas in computing generally goes far deeper than most programmers are aware.

        3 replies →

      • > The idea that information can be hyperlinked together.

        HyperCard was really cool and I miss it. Its most important feature IMO was enabling non-programmers to rather easily author useful software, as happened with Excel.

        The idea that information can be hyperlinked is much older than HyperCard. Check out Ted Nelson and his https://en.wikipedia.org/wiki/Project_Xanadu which predates HyperCard by more than a decade.

        And then there was the https://en.wikipedia.org/wiki/Symbolics_Document_Examiner, or GNU Texinfo and its precursors besides many other attempts.

    • He was more than a prominent UI guy - back then he was both designer and programmer, designing and coding the foundations.

    • People are showing you respect when they credit you with the ability to Google things yourself.

    • Several previous top-level comments address Atkinson's accomplishments, but I agree with you in principle.

    Thanks, Bill. Rest in Peace.

    I was amazed by Bill's software seeing it on a Mac back then - MacPaint mostly, then HyperCard. I was not even 10, but I was already programming, and spent hours trying to figure out how to implement MacPaint's Lasso on my humble ZX Spectrum. (With some success, but not quite as elegant...)

    If you want to experience HyperCard, John Earnest (RodgerTheGreat on HN[0]) built Decker[1], which runs both on the web and natively and captures the aesthetic and most of the experience perfectly. It uses Lil as a programming language - it is different from HyperTalk, but beautiful in its own right. (It doesn't read as English quite the way HyperTalk does, but it is more regular and easier to write - it's a readable/writable vector language, quite unlike those other ones...)

    [0] https://news.ycombinator.com/user?id=RodgerTheGreat

    [1] https://beyondloom.com/decker/

    "How many man-years did it take to write QuickDraw?", the Byte magazine reporter asked Steve.

    Steve turned to look at Bill. "Bill, how long did you spend writing Quickdraw?"

    "Well, I worked on it on and off for four years", Bill replied.

    Steve paused for a beat and then turned back to the Byte reporter. "Twenty-four man-years. We invested twenty-four man-years in QuickDraw."

    Obviously, Steve figured that one Atkinson year equaled six man years, which may have been a modest estimate.

    http://folklore.org/StoryView.py?story=Mythical_Man_Year.txt

    Wow. One of the absolute greatest. The world truly is a different place because of Bill. Bill’s importance in the history of computing cannot be overstated. Hypercard is probably my favorite invention of his. So ahead of its time. Rest in peace Bill

    I know nothing about the fundamentals of “old computing” like what Mr. Atkinson worked on as I am only 27 and have much more contemporary experience. That being said, I still very greatly mourn the loss of these old head techs because the world of tech I use today would not have been possible if not for these incredibly smart and talented individuals. To learn to code without YouTube is truly a feat I could not imagine, and the world will be a lesser place without this kind of ingenuity. Hopefully he’s making some computers in the sky a bit better!

    • It's amazing to remember that there was an entire generation of computers and users for whom a command line was a new and modern invention!

    Bill Atkinson and Andy Hertzfeld were my childhood heroes through their work. Inside Macintosh was a series that enlightened my teen years. Thanks, Bill.

    Another death from pancreatic cancer. I really hope we can figure out why rates are skyrocketing because it is a silent killer and usually isn’t detected until it’s too late.

    I'm a little shook. A hero to many GenX coders I'm sure - I'm one of them. What a legend.

    Atkinson is a legendary UX pioneer. Great technical skill and a deep understanding of the principles of interaction. His work, from the double click to HyperCard, continues to inspire my own work. You will be missed.

    I was just telling someone the story of how he invented bitmapping for overlapping windows in the first Mac GUI in something like two weeks, largely because he misremembered it as already being a feature in the Xerox PARC demo and so was convinced it must be possible.

    RIP to a legend

    I wish I could have met him before he died.

    I'm yet another child of HyperCard. It opened my mind to what computers could be for, and even though the last two decades have been full primarily of disappointment, I still hold onto that other path as a possibility, or even as a slice of reality---a few weeds growing in the cracks of our dystopian concrete.

    The Mac, HyperCard, MacPaint, and General Magic - he's one of the few engineers who has had such a substantial impact on my life. Rest in Peace.

    HyperCard was my introduction to programming and delivered on the vision of personal computing as "bicycle for the mind." RIP

    Atkinson's work is so influential. From his contributions to the Macintosh team, to HyperCard, Bill was an inspiration to me and showed the power of merging art & technology.

    Thanks for everything, Bill — Rest in Peace.

    HyperCard opened my mind as a kid in a way that I couldn't grok until the first time I took mushrooms. What a genius.

    A sad day for me.

    I spent countless hours building HyperCard stacks and creating artwork in MacPaint, in college. A true legend.

    RIP. Fat Bits forever.

    I fondly remember creating simple narrative stories and games with HyperCard at 6 years old on my dad's Macintosh SE. It was my first contact with programming and a fundamental seed to using the computer as a creative tool. It has shaped my life in a substantial way. RIP Bill - HN bar should be blacked out.

    Some of his old demos of graphics capabilities on the Mac or hypercard are around on YouTube, and I watched some maybe 10 years ago. He displayed not just the tech chops but he was a good communicator. RIP.

    "Some say Steve used me, but I say he harnessed and motivated me, and drew out my best creative energy." - Bill Atkinson

    I loved his PhotoCard app, as it allowed image customization of the stamp and printing on very high quality card stock and ink.

    This post is a really beautiful farewell, thanks author for including some examples of his work to smile at.

    Why are so many original Apple people dying of pancreatic cancer? Is it that common and this a coincidence?

    I was just musing to a young team member the other day that I think OOP comes easy to me because I learned HyperCard (v1.2 on System 6 on an SE) at a young age. RIP.

    • This was my experience too. My mom had a subscription to Byte Magazine, and I remember trying to read the articles on OOP when they came out. It was utterly opaque to me. When I started using HyperCard, the light bulb turned on.

      I think a subtle factor is that when learning HC (or Visual Basic, or LabVIEW), you started using objects before you learned how to create them. All of these packages came with lots of pre-written objects that were easy to use. In the case of VB, you had to buy a special version if you wanted to create your own objects, and very few people did.

      I think when teaching newer languages like Python, this is done as a matter of course. For instance if you show someone how to calculate a function and graph it, you're probably using objects from something like Matplotlib, before being shown how to create your own. And once again, among casual programmers, relatively few people define their own classes.
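      The progression described above - comfortably using ready-made objects long before ever defining a class - can be sketched in Python. (The `Card` class below is a hypothetical illustration of that learning arc, not HyperCard's actual object model.)

```python
from datetime import date

# Step 1: use pre-built objects long before learning to define them.
# Attribute access and method calls feel natural without any theory of classes.
today = date(1987, 8, 11)          # HyperCard's launch date
month_name = today.strftime("%B")  # a method call on an object we didn't write

# Step 2: only much later, write a class of your own.
class Card:
    """A toy, HyperCard-flavored card: a name plus named fields.
    (Hypothetical illustration, not HyperCard's real object model.)"""
    def __init__(self, name):
        self.name = name
        self.fields = {}

    def set_field(self, key, value):
        self.fields[key] = value

home = Card("Home")
home.set_field("greeting", "Welcome to my stack!")
print(month_name, home.fields["greeting"])
```

      The point is the same one made about HyperCard and VB: by the time a learner writes Step 2, Step 1 has already made objects feel ordinary - and many casual programmers never need Step 2 at all.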

    Oh man, he's a legend. My condolences to any family members passing by in remembrance. My highest respect goes to those with the tenacity and character required to force a good idea into existence. Bill inspired many people. While reading about him in "Revolution in the Valley", it felt like it recalibrated my own personal compass and gave me a sense of purpose in my own endeavors.

    My time with Atkinson came before the Macintosh, before Hypercard. As a company Apple was struggling and we were preparing for what, in retrospect, was the really terrible Apple III. It was a less optimistic time -- after the Apple II and before the Macintosh.

    A digression: the roster of Apple-related pancreatic cancer victims is getting longer -- Jef Raskin (2005), Steve Jobs (2011), now Bill Atkinson (2025). The overall pancreatic cancer occurrence rate is 14 per 100,000, so such a cluster is surprising within a small group, but the scientist in me wants to argue that it's just a coincidence, signifying nothing.

    Maybe it's the stress of seeing how quickly one's projects become historical footnotes, erased by later events. And maybe it's irrational to expect anything else.

    • Steve Jobs had a pancreatic neuroendocrine tumor, which is not the traditional form of pancreatic cancer people usually talk about. It is far less aggressive and completely treatable - in fact almost 100% curable, as Jobs had it diagnosed at such an early stage.

    HyperCard was my introduction to programming. It was the first time I used a programming language on my mom’s old Macintosh IIci. It really has been a long time. Thank you, Bill.

    I remember Bill from the halcyon days, surrounded by smoke and mirrors. Amazing individual -- rest in Peace

    https://news.ycombinator.com/item?id=21779399

    DonHopkins on Dec 13, 2019 | parent | context | favorite | on: Bill Atkinson: Reflections on the 40th anniversary...

    I recently posted these thoughts about Bill Atkinson, and links to articles and a recent interview he gave to Brad Myers' user interface class at CMU: https://web.archive.org/web/20110303033205/http://www.billat...

    PhotoCard by Bill Atkinson is a free app available from the iTunes App store, that allows you to create custom postcards using Bill's nature photos or your own personal photos, then send them by email or postal mail from your iPad, iPhone or iPod touch.

    Bill Atkinson, Mac software legend and world renowned nature photographer, has created an innovative application that redefines how people create and send postcards.

    With PhotoCard you can make dazzling, high resolution postcards on your iPad, iPhone or iPod touch, and send them on-the-spot, through email or the US Postal Service. The app is amazingly easy to use. To create a PhotoCard, select one of Bill's nature photos or one of your own personal photos. Then, flip the card over to type your message. For a fun touch, jazz up your PhotoCard with decorative stickers and stamps. If you're emailing your card, it can even include an audible greeting. When you've finished your creation, send it off to any email or postal address in the world!

    pvg on Dec 13, 2019 | prev [–]

    Was this bit about LSD and Hypercard covered before what seems like a 2016 interview and some later articles? So much has been written about HyperCard (and MacPaint and QuickDraw) I'm wondering if I somehow managed to miss it in all that material.

    DonHopkins on Dec 13, 2019 | parent | next [–]

    As far as I know, the first time Bill Atkinson publicly mentioned that LSD inspired HyperCard was in an interview with Leo Laporte on April 25th, 2016, which claims to be "Part 2". I have searched all over for Part 1 but have not been able to find it. Then Mondo 2000 published a transcript of that part of the interview on June 18, 2018, and I think a few other publications repeated it around that time.

    And later on Feb 4, 2019 he gave a live talk to Brad Myers' "05-640: Interaction Techniques" user interface design class at CMU, during which he read the transcript.

    http://www.cs.cmu.edu/~bam/uicourse/05440inter2019/schedule....

    It's well worth watching that interview. He went over and explained all of his amazing Polaroids of Lisa development, which I don't think have ever been published anywhere else.

    See Bill Atkinson's Lisa development polaroids:

    http://www.cs.cmu.edu/~bam/uicourse/05440inter2019/Bill_Atki...

    Then at 1:03:15 a student asked him the million dollar question: what was the impetus and motivation behind HyperCard? He chuckled, reached for the transcript he had off-camera, and then out of the blue he asked the entire class "How many of you guys have done ... a psychedelic?" (Brad reported "No hands", but I think some may have been embarrassed to admit it in front of their professor). So then Bill launched into reading the transcript of the LSD HyperCard story, and blew all the students' minds.

    See video of Bill's talk:

    https://scs.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=...

    The next week I gave a talk to the same class that Bill had just traumatized by asking if they'd done illegal drugs, and (at 37:11) I trolled them by conspiratorially asking: "One thing I wanted to ask the class: Have any of you ever used ... (pregnant pause) ... HyperCard? Basically, because in 1987 I saw HyperCard, and it fucking blew my mind." Then I launched into my description of how important and amazing HyperCard was.

    See video of Don's talk:

    https://scs.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=...

    Here is an index of all of the videos from Brad Myers' interaction techniques class, including Rob Haitani (Palm Pilot), Shumin Zhai (text input and swipe method), Dan Bricklin (spreadsheets, Demo prototyping tool), Don Hopkins (pie menus), and Bill Atkinson (Mac, HyperCard):

    https://scs.hosted.panopto.com/Panopto/Pages/Sessions/List.a...

    Oh. I came here to pass the time as I built a TinyMac with a Pi and was compiling BasiliskII in SDL mode. I'm quite saddened by the news, as Bill was one of the people who had the most influence on the technical design of early Macs (and, by all accounts, a brilliant engineer).

    Why isn't the black bar up atop the site?

    • He’s definitely deserving of the black bar.

      This post is only an hour old as I’m writing this, so give it time. It’s a weekend, and as far as I’m aware there are only 2 mods, unless there are others empowered to turn on the black bar in their absence.

    [flagged]

    • Are you doing a Comic-Book-Guy impersonation?

      • No, just someone who isn't steeped in surveillance culture. My obtuseness is just a direct response to the obtuseness of the surveillance industry demanding "consent" when I'm trying to read about someone's death. I have also entered throwaway nyms for the online streams of family funerals that have tried to bundle abusive legal terms. What's socially gauche here is letting a moment of mourning turn into leverage for the surveillance industry.

    RIP. It still surprises me that people with resources die so early (he died at 74).

    • There isn't an amount of resources in the world that will protect you from cancer, despite what some claim. Like my grandma said, "it is your reward for surviving absolutely everything else that could have got you" (she beat 3 different kinds of cancer before losing to a 4th, with 'resources')

    • Bill pushed himself to his limits. I saw this first hand at General Magic, and heard the stories about the development of the Macintosh. People can wear themselves out.

    • I wouldn't consider 74 early.

      • It is early.

        “A 60-year-old male in the US can expect to live until about age 82”

        Pancreatic cancer usually is hard to detect until it’s reached an advanced stage. We really should invest more into research

    • Unless you're getting frequent preventative screenings, pancreatic cancer can be one of those that show no symptoms until you're already in stage 4. And most normal doctors will tell you not to do large amounts of preventative screening.

    • > RIP. It still suprises me that people with resources die so early (he died at 74).

      You don't know how long he had that disease; if anything, resources might have afforded him many more years of life in the first place. So your comment strikes me as odd, given that you can't judge how long he lived with it.

      My friend's dad died from the same kind of cancer. Two months passed between the diagnosis and his death, and he had plenty of "resources"...

      • A friend of mine's diagnosis to death was less than a week. It all happened so fast, they couldn't process what had just happened.

        It happened during a family reunion for Christmas, so at least everyone was present.

    • Resources only help you reach your genetic potential, but if you’re just not built for longevity you still may not live long.

      And some people with no resources, no reason to live, but have incredible genetics will linger for many years beyond what people think is possible, like a weed.