Comment by 13of40
3 years ago
It might still be a valid test, because PowerShell needs to have a bunch of code in the stack between the keypress event and the call into the console API that actually displays the character. Among other things, the entire command line is getting lexically parsed every time you press a key.
If you think "parsing the command line" should or does take appreciable time on a human timescale when executed by a modern superscalar processor, then your mental model of computer performance is off by at least four or five orders of magnitude. Not four or five times wrong, but tens of thousands of times wrong.
Just for context, I worked on the feature team for a lot of early versions of PowerShell, so I kind of know where the bodies are buried. Here are some empirical numbers, just for the parsing part:
10K iterations of $null=$null took 38 milliseconds on my laptop (about 3.8 microseconds each).
10K parses of the letter "a" took 110 milliseconds (about 11 microseconds each).
10K parses of a 100K-character comment took 7405 milliseconds (about 740 microseconds each).
10K parses of a complex nested expression took just over six minutes (about 36 milliseconds each).
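A minimal sketch of how numbers like these can be reproduced, using the public Parser.ParseInput API (the exact inputs benchmarked above aren't given, so the test strings here are illustrative):

    # Time N parses of a string with the public parser API.
    # Parser.ParseInput tokenizes and builds the AST in one call; there is
    # no supported way to run the tokenizer on its own.
    function Measure-Parse([string]$Source, [int]$Iterations = 10000) {
        $tokens = $null; $errors = $null
        $sw = [System.Diagnostics.Stopwatch]::StartNew()
        for ($i = 0; $i -lt $Iterations; $i++) {
            $null = [System.Management.Automation.Language.Parser]::ParseInput(
                $Source, [ref]$tokens, [ref]$errors)
        }
        $sw.Stop()
        '{0} parses of {1} chars: {2} ms' -f $Iterations, $Source.Length, $sw.ElapsedMilliseconds
    }

    Measure-Parse 'a'                            # single letter
    Measure-Parse ('<#' + ('x' * 100KB) + '#>')  # ~100K-character comment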
You're probably imagining a lexer written in C that tokenizes a context-free language and does nothing else. In PowerShell, you can't run the tokenizer directly; you have to use the parser, which also builds an AST. The language itself is a blend of two different paradigms, so a token can have a totally different meaning depending on whether it's part of an expression or a command, which means more state to track during the tokenizer pass.
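You can see the two modes from any PowerShell prompt; the same three characters tokenize completely differently depending on context:

    2+2        # expression mode: two number tokens and an operator; prints 4
    echo 2+2   # command mode: a single bare-word argument token; prints the string "2+2"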
On top of that, language performance wasn't a development priority until around version 3 or 4, and the main perf advancement then was compiling the AST to dynamic code for script blocks that get executed more than a certain number of times. The parser itself was never subject to any deep perf testing, IIRC.
Plus it does a bunch of other stuff when you press a key, not just the parsing. All of the host code that listens for the keyboard event and ultimately puts the character on the screen, for example, is probably half a dozen layers of managed and unmanaged abstractions around the Win32 console API.
All but the last of those work out to microseconds per parse, a far cry from the ten-plus milliseconds that would be noticeable to humans; only the complex nested expression, at roughly 36 ms each, gets into human-noticeable territory.
The test is valid for any combination of shell and terminal; it's just a matter of figuring out which methodology was used so the numbers can be interpreted properly.
But yeah, I agree with the other comment that PowerShell is likely adding less than 1 ms.
I just measured 3 ms from a simulated keyboard event (through COM) to the presence of the character in the console buffer, so that's OS time plus PowerShell time without keyboard or screen time. Unfortunately, measuring the same through CMD or a custom console app is more work than I care to put in tonight, so who knows what the real delta would be.
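For reference, a rough sketch of one way to take that kind of measurement from inside the session, assuming WScript.Shell's SendKeys as the COM route (the comment doesn't say which object was used). Note this only times the keystroke's trip into the console input queue, not the echo into the screen buffer that the 3 ms figure covers:

    # Rough sketch (Windows only): time from a COM-simulated keystroke to
    # the key event showing up in this console's input queue. Captures the
    # OS input path only, not PSReadLine processing or the screen echo.
    $wsh = New-Object -ComObject WScript.Shell
    while ([Console]::KeyAvailable) { [void][Console]::ReadKey($true) }  # drain stale input
    $sw = [System.Diagnostics.Stopwatch]::StartNew()
    $wsh.SendKeys('x')                        # inject the keystroke
    while (-not [Console]::KeyAvailable) { }  # spin until it arrives
    $sw.Stop()
    [void][Console]::ReadKey($true)           # consume the injected key
    '{0:N3} ms' -f $sw.Elapsed.TotalMilliseconds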