
Comment by varispeed

9 hours ago

One clue is that you cannot reliably predict what key a user is going to press next, so the jitter would always be added after the actual key press. You can minimise that by adding constant latency, so that you can simulate pulling events back in time, but this gets complex quickly and could still be filtered out. As for methods, it depends on the jitter. Think of things like noise removal in audio and adaptive filtering. Adding extra packets is much easier and more secure.

Okay, I think I see the issue (and my slight misunderstanding). I believe the problem is actually latency. I was assuming the jitter interval would be noticeably larger than the gap between typical (say, 95% of) key presses. Any smaller than that and you start to need cover traffic.
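To make the latency point concrete, here's a minimal sketch of the delay-plus-jitter scheme described above. The function name and the specific constants (50 ms base latency, ±40 ms jitter) are my own illustrative assumptions, not from the article; the key constraint is that the constant latency must be at least as large as the jitter half-width, since a packet can never be sent before the key was actually pressed:

```python
import random

def jittered_send_times(press_times, base_latency=0.050, jitter=0.040):
    """Delay each keypress by a constant latency plus uniform jitter.

    base_latency >= jitter guarantees no packet is scheduled before
    the actual press, which is what "pulling events back in time"
    costs you: every keystroke pays the full base latency.
    """
    assert base_latency >= jitter, "latency budget must cover the jitter"
    return [t + base_latency + random.uniform(-jitter, jitter)
            for t in press_times]

# Hypothetical press times with ~70-160 ms gaps (fast typing).
presses = [0.00, 0.12, 0.19, 0.35]
sends = jittered_send_times(presses)
```

Note that with ±40 ms of jitter and a 70 ms inter-key gap, consecutive send times can already swap order, which is exactly the reordering discussed below.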

Such an interval would still face correlation issues, since the overlap between consecutive jitter intervals varies, but it seems like that should be trivial to address. That said, just throwing in some cover traffic is bound to be simpler.

But a jitter interval long enough that keystroke packets can change order is going to be noticeable to a human typing quickly on what should be a solid connection - my WiFi is only at 3 to 6 ms RTT and I already notice that versus a wired connection. That doesn't sound so trivial to fix, and once again just throwing in some cover traffic completely solves the issue.
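The "filtered out" concern is worth making concrete: if the jitter is independent noise added to each packet, an attacker who observes the same digraph (say, "th") many times can simply average the observed gaps and recover the true inter-key timing. A toy simulation of that averaging attack, with assumed constants:

```python
import random
import statistics

def estimate_true_gap(true_gap, jitter=0.040, n=500):
    """Observe a fixed inter-key gap through independent uniform
    +/-jitter noise on each of two packets, n times, and average.

    Each single observation is off by up to 2*jitter, but the noise
    has zero mean, so the sample mean converges to true_gap.
    """
    samples = []
    for _ in range(n):
        first = random.uniform(-jitter, jitter)
        second = true_gap + random.uniform(-jitter, jitter)
        samples.append(second - first)
    return statistics.mean(samples)

est = estimate_true_gap(0.120)  # converges toward 0.120 as n grows
```

This is why pure jitter only slows the attacker down by a factor of roughly n, whereas cover traffic removes the signal the attacker is averaging over.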

So just do what's simple.

My next question was going to be: why on the order of 100 extra packets instead of just 1 or 2? But of course an attacker could attempt to search some set of permutations for recognizable words. So either you drown everything out (simple) or you hook a multilingual dictionary up to a keystroke-delay model for your cover-traffic generator (complex).
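The reason 1 or 2 cover packets don't help can be seen from the attacker's search space. If real and cover packets are indistinguishable on the wire, the attacker must guess which subset of packets is real, and that count grows combinatorially. A quick back-of-the-envelope check (the packet counts are illustrative):

```python
from math import comb

def attacker_candidates(real_keys, cover_packets):
    """Number of ways to choose which packets are real keystrokes
    when real and cover packets look identical to the attacker."""
    return comb(real_keys + cover_packets, real_keys)

attacker_candidates(10, 2)    # 66 candidates: trivial dictionary search
attacker_candidates(10, 100)  # ~4.7e13 candidates: drowned out
```

With 1-2 decoys the attacker brute-forces every candidate subset against a dictionary; at ~100 decoys per word the search space is infeasible without a timing model to prune it.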

But really, shouldn't this feature be implemented as some constant (low) background level of cover traffic that scales up as your typing frequency increases but caps out at some (still fairly low) rate? That seems both less likely to suffer inadvertent leaks and less likely to run afoul of the issue in the article.
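The shape proposed above (constant floor, scaled by typing rate, hard cap) can be sketched in a couple of lines. All constants here are placeholders I picked for illustration, not values from the article:

```python
def cover_rate(typing_hz, floor=2.0, gain=1.5, cap=15.0):
    """Cover-packet rate in packets/sec: a constant background floor,
    scaled up with the user's current typing frequency, capped at a
    fixed ceiling so bursts of fast typing don't leak through the rate.
    """
    return min(cap, floor + gain * typing_hz)

cover_rate(0.0)   # idle: 2.0 pkt/s of background cover
cover_rate(6.0)   # moderate typing: 11.0 pkt/s
cover_rate(12.0)  # fast typing: clamped to 15.0 pkt/s
```

The cap matters: without it, the cover rate itself becomes a proxy for typing speed, which is the kind of inadvertent leak the floor-plus-cap design is meant to avoid.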