Comment by mehrdadn

6 years ago

That's most definitely not how security works. The strength of your password is not proportional to the number of bits of entropy it has.

The way you're phrasing this may be misleading.

The strength of a password / passphrase increases as 2 raised to the power of its bits of entropy.

That's an exponential proportion, rather than a linear one. But a proportion all the same.

Example:

Given mixed-case alphanumeric (62 characters) and an 8-character password length, the number of combinations is:

    62^8 = 218,340,105,584,896 (keyspace -- about 218 trillion)
    log2(62^8) = 47.6 (bits of entropy)

A 10-character password (if randomly chosen from the same character set) has 62^10 ≈ 8.4 × 10^17 possible combinations (about 3,844x more, since 62^2 = 3,844), and 59.5 bits of entropy, 11.9 bits more. 2^11.9 ≈ 3,844.
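
To make the arithmetic above concrete, here is a minimal Python sketch of the same calculation (the entropy_bits helper is a made-up name, just for illustration):

    # Minimal sketch of the calculation above; entropy_bits is a hypothetical helper.
    from math import log2

    def entropy_bits(alphabet_size: int, length: int) -> float:
        """Bits of entropy of a password drawn uniformly at random."""
        return length * log2(alphabet_size)

    ALPHANUMERIC = 62  # a-z, A-Z, 0-9

    for length in (8, 10):
        keyspace = ALPHANUMERIC ** length
        print(f"length {length}: keyspace = {keyspace:,} "
              f"({entropy_bits(ALPHANUMERIC, length):.1f} bits of entropy)")

    # length 8: keyspace = 218,340,105,584,896 (47.6 bits of entropy)
    # length 10: keyspace = 839,299,365,868,340,224 (59.5 bits of entropy)
    # The length-10 keyspace is 62^2 = 3,844 times larger, i.e. ~11.9 extra bits.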

In the context of randomly generated passwords, it's absolutely OK to think about it in terms of the logarithmic relationship between 1) entropy per symbol times the number of symbols and 2) the strength of the password (its keyspace).

He said 10% stronger (which I took to mean 10% more entropy), not 10% more time to crack.

  • > He said 10% stronger (which I took to mean 10% more entropy), not 10% more time to crack.

    Hence the problem?

    Yes, measuring "strength" by "bits of entropy" is technically correct (the best kind of correct...).

    It's also exponentially misleading... possibly the worst kind of misleading?

    Just look at the question: "Is there even a reason to include special characters in passwords? They add 10% more to security...". I don't know about you, but to me that doesn't really portray an understanding of the fact that, even at merely 8 characters, such a password takes roughly twenty-five times longer to crack, not merely 10% longer (a quick sketch of this arithmetic follows the thread below).

    • I mean, counting in entropy, with the knowledge that it's a logarithmic scale, is the standard way of discussing such matters. It's sort of the basis for the information theory that underlies this type of work.

      Edit: And the point of his argument is that more symbols drawn from a smaller alphabet can be equivalent if the total entropy is equivalent.
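
For the "10% more to security" point above, here is a rough sketch of the same arithmetic, assuming 62 alphanumerics versus roughly the 94 printable ASCII characters (the exact multiplier depends on how many special characters you count, which is why this estimate lands near 28x rather than exactly twenty-five):

    # Rough sketch: how "10% more entropy" translates into brute-force work.
    # Assumes 62 alphanumerics vs. ~94 printable ASCII characters; exact sizes vary.
    from math import log2

    LENGTH = 8
    alnum_bits = LENGTH * log2(62)    # ~47.6 bits
    special_bits = LENGTH * log2(94)  # ~52.4 bits
    extra_bits = special_bits - alnum_bits

    print(f"extra entropy: {extra_bits:.1f} bits (~{100 * extra_bits / alnum_bits:.0f}% more)")
    print(f"work multiplier: ~{2 ** extra_bits:.0f}x longer to brute-force")
    # extra entropy: 4.8 bits (~10% more)
    # work multiplier: ~28x longer to brute-force

    # The equivalence point from the reply above: a smaller alphabet can reach the
    # same entropy with a longer password, e.g. lowercase-only (26 symbols):
    print(f"lowercase-only length for {special_bits:.1f} bits: {special_bits / log2(26):.1f} characters")
    # lowercase-only length for 52.4 bits: 11.2 characters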