Comment by unscaled

8 months ago

Back in 2012, TLS was not yet enabled everywhere. OAuth 1.0 was based on client signatures (just like JAR, DPoP, etc., but far simpler to implement), and it was a good fit for its time. One of Eran Hammer's top gripes with the direction OAuth 2.0 was taking was that it removed cryptography and relied on TLS instead. In hindsight, I think this turned out to be a good decision: TLS became the norm very quickly, and the state of cryptography at the IETF during that period (2010) was rather abysmal. If OAuth 2.0 had mandated signatures, we'd have ended up with yet another standard pushing RSA with PKCS#1 v1.5 padding (let's not pretend most systems use anything else with JWT).

But that's all hindsight being 20/20, I guess. I think the criticism that has better withstood the test of time is that OAuth 2.0 is more of a "framework" than a real protocol. There are too many options, and you can implement almost anything you want. Options like the implicit flow and the password grant shouldn't have been in the standard in the first place, and the language around refresh tokens and access tokens should have been clearer.

Fast forward to 2024, and I think we've started going back to cryptography, but I don't think it's all good. The cryptographic standards that modern OAuth specs rely on are too complex, and that complexity creates a lot of opportunities for attacks. I've yet to see a single cryptographer or security researcher who is satisfied with the JOSE/JWT set of standards. You can use them securely, but you can't expect any random developer (including most library authors) to do so.
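
To make that concrete, here's a minimal, hypothetical sketch (not something from the specs or any particular library) of the well-known "alg confusion" pitfall: a verifier that lets the token's own header pick the algorithm, so an RS256 deployment can be downgraded to HS256 with the public key reused as the HMAC secret. The `naive_verify` helper and its names are made up for illustration.

```python
import base64
import hashlib
import hmac
import json


def b64url_decode(data: str) -> bytes:
    # JWTs use unpadded base64url; restore the padding before decoding.
    return base64.urlsafe_b64decode(data + "=" * (-len(data) % 4))


def naive_verify(token: str, rsa_public_key_pem: bytes) -> dict:
    """A broken verifier: it lets the token's own header choose the algorithm."""
    header_b64, payload_b64, signature_b64 = token.split(".")
    header = json.loads(b64url_decode(header_b64))

    if header["alg"] == "HS256":
        # BUG: the server's RSA *public* key (which an attacker also has) is
        # silently reused as the HMAC secret, so the attacker can mint tokens
        # that pass this branch.
        expected = hmac.new(
            rsa_public_key_pem,
            f"{header_b64}.{payload_b64}".encode(),
            hashlib.sha256,
        ).digest()
        if not hmac.compare_digest(expected, b64url_decode(signature_b64)):
            raise ValueError("bad signature")
        return json.loads(b64url_decode(payload_b64))

    # ... a real RS256 branch with actual RSA verification would go here ...
    raise ValueError("unsupported alg")
```

The usual fix is to pin the accepted algorithm (and key type) on the server side instead of deriving either from the untrusted token, but the fact that this has to be said at all is kind of the point.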