Comment by omoikane

19 hours ago

> send all typed characters at 50ms intervals

Wouldn't this just change the packet interval from 20ms to 50ms? Or did you mean a constant stream of packets at 50ms intervals, nonstop?

I think the idea behind the current implementation is that keystrokes are batched in 20ms intervals, with the optimization that a sufficiently long silence stops the chaff stream. The keystroke timing is thus obfuscated, with an error bar of 20ms multiplied by the number of chaff packets.
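Roughly, the batching-plus-chaff scheme as I understand it could be sketched like this (the silence threshold and tick length are made-up parameters, not whatever the real implementation uses):

```python
CHAFF_INTERVAL_MS = 20  # batching tick, per the scheme described above
SILENCE_TICKS = 10      # assumed: stop chaff after this many empty ticks

def chaff_stream(keystroke_ticks, total_ticks):
    """Simulate which 20ms ticks emit a packet (real or chaff).

    keystroke_ticks: set of tick indices where a real key was pressed.
    Returns the tick indices at which a packet goes out on the wire.
    """
    sent = []
    idle = SILENCE_TICKS  # start quiet: no chaff until a real keystroke
    for t in range(total_ticks):
        if t in keystroke_ticks:
            idle = 0            # real data resets the silence counter
            sent.append(t)
        elif idle < SILENCE_TICKS:
            idle += 1
            sent.append(t)      # chaff packet, indistinguishable on the wire
    return sent
```

An eavesdropper only ever sees a packet on a tick boundary, and the stream keeps going for a while after the last real keystroke, so individual key timings are blurred to tick granularity.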

I assume the problem, such as it is, is that a real human typing in a 20-50ms window would generate a few characters at most, while a program could generate gobs of data. So you automatically know which packets to watch. Then if there were more packets, the likely keys were in set X, while if there were fewer, the likely keys were in set Y.

So a clock doesn't solve the problem: the amount of data sent on each clock pulse also tells you something about what was sent.

The chaff packets already fire on a timer. They inject random extra fake keystrokes so you can't tell how many keystrokes were actually made. The only other way I can think of to solve that is a step function: on each clock pulse, send one larger packet (fragmented, or the same number of individual packets), padded up to some N, where N is the maximum number of keystrokes ever recorded plus some margin. Effectively almost every clock pulse will be one packet (or set of packets) of identical size. Of course, if you do that, you'll end up consuming more data over time than sending random amounts of packets.
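The step-function idea might look roughly like this (MAX_KEYS_PER_TICK is a hypothetical padding bound, and the padding byte is arbitrary):

```python
MAX_KEYS_PER_TICK = 8  # assumed N: max keystrokes ever seen per tick, plus margin

def padded_batch(real_keys):
    """Pad each tick's batch to a fixed size so packet length leaks nothing.

    real_keys: list of keystroke payloads collected during this clock pulse.
    Returns a batch of exactly MAX_KEYS_PER_TICK entries.
    """
    if len(real_keys) > MAX_KEYS_PER_TICK:
        # overflow: would have to fragment into multiple fixed-size packets
        raise ValueError("batch exceeds padding budget")
    padding = [b"\x00"] * (MAX_KEYS_PER_TICK - len(real_keys))
    return real_keys + padding  # always the same size on the wire
```

Every pulse then costs the full N keystrokes' worth of bandwidth whether one key or zero keys were pressed, which is exactly the data-overhead trade-off mentioned above.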

  • The problem, as far as I understand, is not knowing whether someone is typing, but that you may be able to extract some information about which keys are being typed from the small differences in timing between them.