Comment by CGamesPlay

6 months ago

(Article is from 2023, so the title should be updated to say "32 years ago", or something)

The biggest loss in TUIs is the latest wave of asynchronous frameworks, which bring the joy of dropped keypresses to the terminal.

In any TUI released before the year 2000, if you pressed a key when the system wasn't ready, the keypress would simply wait in a buffer until the system was ready. Many TUIs today still do this, but increasingly often, with the modern "web-inspired" TUI frameworks, the system will accept your keypress and then discard it, because the async dialog box hasn't registered its event listener yet.

Other than that antipattern, TUIs are doing great these days. As for terminal IDEs, Neovim has never been more featureful, with LSPs and other plugins giving all the features this article discusses. I guess it isn't a mouse-driven TUI, so the author wouldn't be interested, but still.

Yes. Back in the DOS days, and even before, when people used actual terminals, there was a keystroke buffer. You'd see people who really knew the interface fly through tasks being multiple keystrokes ahead of the UI. Stuff would just flash onto the screen and disappear as it processed the input that was already in its buffer. It should be possible to implement this with modern frameworks, but it requires thought.
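That kind of type-ahead is straightforward to reproduce with a little thought. Here is a minimal sketch (a hypothetical toy, not any real framework's API; the `InputQueue` name and its methods are invented) that buffers keypresses arriving before a handler is registered and replays them in order once one attaches, instead of discarding them:

```python
# Hypothetical sketch of DOS-style type-ahead: keys that arrive
# before a listener is attached wait in a queue and are replayed
# in order on registration, rather than being dropped.
from collections import deque

class InputQueue:
    def __init__(self):
        self._pending = deque()
        self._handler = None

    def feed(self, key):
        # A key arriving with no listener waits in the buffer.
        if self._handler is None:
            self._pending.append(key)
        else:
            self._handler(key)

    def register(self, handler):
        # On registration, drain everything typed "ahead" of the UI.
        self._handler = handler
        while self._pending:
            handler(self._pending.popleft())

received = []
q = InputQueue()
q.feed("j")                  # typed before the dialog is ready
q.feed("Enter")
q.register(received.append)  # dialog finally attaches its listener
q.feed("k")
print(received)              # -> ['j', 'Enter', 'k']
```

The whole trick is that `feed` never throws input away: a framework that tears down one listener before the next attaches only needs a queue like this between the terminal and its event dispatch to stop dropping keys.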

  • Yeah. I used to work as a phone surveyor, the one you hate. Our software was a terminal connected to a mainframe. I got used to it after a few weeks and was very productive.

    Costco Canada vision shops still used terminals connected to an AS/400 machine when I snooped around last month.

    • In the late 90s I was required to slowly replace dumb terminals with PCs. One of the older ladies taking phone orders was most put out by this, understandably. She was lightning fast on that terminal. She'd never used a PC (I hit on the idea of using solitaire to learn to use a mouse, which worked amazingly well), and was never able to get to the same speed with one as she'd done on her dumb terminal. It's hard to beat the performance of dedicated devices.

  • Fun story: When I worked at Blockbuster I had my computer access revoked and was summoned to explain, because a colleague told management I was "hacking" when they saw me doing this on the computer system.

    • Makes me wonder if that’s where the TV trope of a hacker flying through screens faster than you can see came from

    • Was that still on the VMS-based Blockbuster video system?

      Weird question, but I accidentally ended up with one of those in my hands; it ran in a (probably non-Blockbuster) place from 1996 to 2000 :)

  • > You'd see people who really knew the interface fly through tasks being multiple keystrokes ahead of the UI.

    I remember.

    This, unfortunately, killed people: Therac-25. Granted, the underlying cause was a race condition, but the trigger was the flying fingers of experts typing ahead, unknowingly having been trained to rely on the hardware interlock present in older models.

    • > This, unfortunately, killed people: Therac-25. Granted, the underlying cause was a race condition

      So it didn't kill people; something else was the cause

  • Remember the venomous, desperate BEEP! when the keystroke buffer was full. (Or was it when pressing too many keys at once?) Like a tortured waveform generator constantly interrupted by some higher-priority IRQ. Good times.

  • The keyboard buffer size was something like sixteen keystrokes. This was bad news if you noticed your input wasn't working and needed to press CTRL + whatever to quit the program: the buffer was full and couldn't accept it. Instead it had to be CTRL + ALT + DEL.

    Three decades later I learn that there were utilities to make the keyboard buffer bigger. But, in those days before search engines, how was I to know?

Yes! That phenomenon drives me crazy. I used to be able to use a computer at warp speed by staying ahead of its responses with chains of rapid keyboard shortcuts etc. Now it's like I'm trying to stride through molasses.

Rough quote: "in 1984 we had at my house",

so even 41 years ago seems to be in scope.

I was expecting

- early projects that ended in Visual Studio 1.0 or NetBeans soon after, (2 to 9 years too early for them)

not

- "vim (1991) was not out yet" (not-a-quote, but my feeiling upon looking at ncurses instead of floating windows)

  • I snickered a little because I know Visual Studio didn't have a version 1.0. Wikipedia identifies the first version as Visual Studio 97, which was at version 5.0. I remember before that there was "Microsoft Developer Studio 4.0" which came out around Windows 95, and could run on 95 or on NT 3.51. There was a Visual C++ 1.0 and a Visual Basic 1.0 released at different times. Meanwhile there were also the workhorses, Microsoft C and MASM. In those days, Borland and Watcom were real competitors to Microsoft for C and C++.

  • Yeah, by 1995, Visual Basic / C++, Delphi / Borland C++, and Symantec C++ were all-conquering.

    A few years before, it was very different - VisualAge and Rational Application Developer were the big names in the early 90s in "professional" IDEs. Interface Builder for university spin-outs or funky startups (and SunWorks / Forte Studio for the less-funky ones). CodeWarrior on the Mac (perhaps with THINK! hanging on too). I think Softbench was popular for scientific software, but I never actually saw it myself.

    And then just a few years later, the rise of Java turned things upside down again and we got JBuilder, Visual Cafe, & NetBeans as the beginning of yet another new wave. The Visual Studio suite really began to take off around then, too.

    In short, the 90s were a time of huge change and the author seems to have missed most of it!

    • An all-in-one like Rational Rose may be making a comeback in terms of these agentic AI projects, because now you actually can turn a spec into code without layers of tagging and UML.

I wasn't paying attention to when 30 years ago actually was...

So disappointing to expect a GUI Smalltalk System Browser and seeing DOS TUIs.

And then delight recalling Turbo C/Pascal and MS C 4.0 with CodeView that even worked in 43 or 50 line modes.

  • Yes, me too, I was expecting either Smalltalk or LISP machine GUIs.

    Having said that, some old TUIs were clearer and faster even on weaker hardware. This should be a lesson for us today. Color transitions and animated icons flying over the desktop are NOT what I need, but speed, clarity, and discoverability of more rarely used functionality are vital.

When people love an IDE product so much that they can't work without it, they have overspecialised to their detriment. And possibly to the detriment of the code itself.

> As for terminal IDEs

The GNU/Linux terminal is the killer app. Multiple terminals in a tiling window manager is peak productivity for me. (Browser in a separate virtual workspace.)

And modern scaling for a big display is unbeatable for developer ergonomics.

  • > When people love an IDE product so much that they can't work without it, they have overspecialised to their detriment.

    I think you are wrong.

    https://en.wikipedia.org/wiki/Muscle_memory

    Being extremely good at something increases the gap between said something and everything else. That doesn't mean being extremely good at the first thing is "over-specialization to detriment". If someone is equally mediocre at everything, they have no such gap, so no "over-specialization to detriment"; but is that really worth desiring? I think not.

    • > Being extremely good at something increases the gap between said something and everything else.

      You're also potentially over-specializing at one level while at the same time neglecting other levels.

      Musicians run into this problem when, for example, they rely solely on muscle memory to make it through a performance. Throw enough stress and complicated music at them and they quickly buckle.

      Meanwhile, a more seasoned performer remembers the exact fingers they used when drilling the measure after their mistake, what pitch is in the bass, what chord they are playing, what inversion that chord is in, the context of that chord in the greater harmonic progression, what section of the piece that harmonic progression is in, and so forth.

      A friend of mine was able to improvise a different chord progression after a small mistake. He could do this because he knew where he was in the piece/section/chord progression and where he needed to go in the next measure.

      In short, I'm fairly certain OP is talking about these levels of comprehension in computer programming. It's fine if someone is immensely comfortable in one IDE and grumpy in another. But it's not so fine if changing a shortcut reveals that they don't understand what a header file is.

  • Why is it to their detriment? It's not like they're stuck with it forever. "Can't work without it" is really "won't work without it because they prefer installing it over going without."

  • As someone who started when only rich people could afford GUIs, I don't understand what is so "killer app" about it.

    We used text terminals because that is what we could afford, and I gladly only start a terminal window when I have to.

    • > I gladly only start a terminal window when I have to.

      Exactly so. I am perfectly able to work entirely in a text/CLI world, and did for years. I don't because I don't have to. I have better, richer alternative tools available to me now.

      It was very odd to join Red Hat briefly in 2014 and meet passionate Vi advocates who were born after I tried Vi and discarded it as a horrible primitive editor.

[flagged]

  • People should also stop using terminal emulators. It is pretty silly to base software around ancient printing terminals. Everyone knows for a fact that only tech illiterates use a console instead of a GUI. Since all great devs use a GUI. Just a fact.

    Also, people should stop playing 2D games. It is pretty silly to base your entertainment on ancient technology when modern GPUs can render super-complex 3D scenes.

    And don't make me start on people who still buy vinyl...

    • Current GPUs can't compete with my brain 'rendering' a Slash'EM/Nethack scene with my pet cat while I kick some foes' asses with my Doppelganger Monk full of Wuxia/Dragon Ball/Magical Kung Fu techniques.

    • Honestly hard to disagree with your first point even though it's sarcasm.

      It's still quite easy to end up with a terminal you need to reset your way out of (eg with a misguided cat), not to mention annoying term mismatches when using tmux/screen over SSH, across OSes, or (and this is self inflicted) in containers.

    • Completely disingenuous. Stop the snark.

      For UI there exists a straight up superior alternative, which keeps all of the benefits of the old solution. Neovim is just straight up better when used outside of a terminal emulator.

      What is true for TUI vs. GUI is not true for CLI vs. GUI (or TUI for that matter); pretending the argument I made applies to the latter is just dishonest. You cannot replace CLI interfaces adequately with GUI or TUI interfaces, but you can totally replace TUI interfaces with GUIs. See Neovim as an example: it is superior software when used outside of the terminal.

  • TUIs are the best cross-platform apps. They run on all the major and minor platforms in general use; GUIs can't compete, with browsers being the next closest thing. They can be integrated with the shell and also work perfectly well remotely without issues. TUIs are superior in many ways to GUIs and have a place in the ecosystem.

    • > TUIs are superior in many ways to GUIs and have a place in the ecosystem.

      There's another reason you don't mention.

      Consistent UI.

      TUI apps can (and in the Windows world usually do) use the same keyboard controls, derived from IBM CUA, as their GUI equivalents do.

      This is why I use Tilde in the Linux shell: the same commands work in it as in Pluma or Leafpad or Mousepad or whatever: Ctrl+O opens a file, Ctrl+X/C/V to cut/copy/paste, Ctrl+N for new, etc.

    • TUIs do not even run the same across terminal emulators.

      It is a total joke to call something which depends on how the underlying terminal emulator interprets specific ANSI escape sequences "multi platform".

  • Modern terminals do color just fine -- 24-bit color support has existed since 2010-ish, and been mainstream since 2015.

    There's nothing wrong with graphical IDEs... or text user interfaces. Great developers use both. Low effort troll is low effort.

    • +1 - crap code can come out of notepad / emacs / vi or IDE-flavor-of-the-day or even the AI code sausage maker. Testing, specification, knowing what you are building and why still matters.

  • Agreed, we used TUIs because we couldn't afford anything better on MS-DOS, CP/M, and 8-bit home computers.

    People on better systems like the Amiga and Atari were already past that.

  • SSH comes to mind.

    • How so? I use remote machines all the time; why would I need a TUI for that? VSCode and Zed support editing on remote machines, and the remote machine's drives can also be mounted on the local machine. What purpose would any TUI have? What even are the potential benefits?

      Right now I can use the exact same software I use on my local machine. Can you give me any reason why I should consider anything else?