Comment by dathinab

8 months ago

Also interesting is that OAuth2 is a bit too flexible in how you can put things together, and OIDC provides a lot of good-practice guidance about how to do so.

So even systems where OIDC compliance is a non-goal are often partially OIDC compliant; I mean, there really is no reason to reinvent the wheel if parts of the OIDC standard already provide all you need.

OAuth 2.1 tightens it up a bit.

I don't think very many people know that OAuth 2.1 exists, though.

  • 2.1 is still in draft stage.

    • 2.1 is mainly updating 2.0 with various later RFCs and usage recommendations, many of which are not drafts. And some documents it uses, e.g. https://datatracker.ietf.org/doc/html/draft-ietf-oauth-secur... are technically drafts but practically "living" documents constantly kept up to date.

      So while technically 2.1 is a draft, practically not following it means not following today's best practices. You don't have to meet it fully, but if you care about security you really should treat it as strict/strong recommendations. At least _long term_, not doing so could be seen as negligence for high(er)-security applications.

      The most important changes are listed here https://oauth.net/2.1/

OIDC is still far more flexible than I would have liked. It still allows the implicit flow, and it even created a new abomination that didn't exist in OAuth: the Hybrid Flow. If you just want to follow best practices, OAuth 2.1[1] or the OAuth Best Current Practices[2] are far better options.

[1] https://oauth.net/2.1/

[2] https://oauth.net/2/oauth-best-practice/

  • Yes, indeed. Both OAuth 2.1 & the BCP tighten things up a lot, although neither is technically final yet (the security BCP should be published as an RFC "any day now").

    For people looking for an easy-to-follow interoperability/security profile for OAuth2 (assuming they're most interested in the authorization code flow, though it's not exclusive to that), FAPI2 is a good place to look; the most recent official version is here:

    https://openid.net/specs/fapi-2_0-security-profile.html

    This is a good bit shorter than the 50 or so pages of security BCP :)

    FAPI2 should also be moving to 'final' status within the next few months.

    (Disclaimer: I'm one of the authors of FAPI2, or will be once the updated version is published.)

    • On the flip side, it is much more complex to implement than OAuth 2.1, since it mandates a lot of extra standards, some of them very new and with very little in the way of library support: DPoP, PAR, private_key_jwt, Dynamic Client Registration, Authorization Server Metadata, etc.

      Except for PAR, these extra requirements are harder to implement than their alternatives, and I'm not sold that they increase security. For instance, DPoP with mandatory "jti" is not operationally different from having a stateful refresh token: you've got to store the nonce somewhere, after all. Having a stateful refresh token is simpler, you remove the unnecessary reliance on asymmetric cryptography, and as a bonus you save some headaches down the road if quantum computers that can break EC cryptography become a thing.
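      To make the "you've got to store the nonce somewhere" point concrete, here's a minimal sketch (names and the in-memory store are illustrative, not from any spec or library) of the server-side bookkeeping that a mandatory "jti" implies — which is exactly the kind of per-token state a stateful refresh token would also need:

      ```python
      import time

      class JtiReplayGuard:
          """Reject a DPoP proof whose 'jti' was already seen within the
          acceptance window. Illustrative in-memory store; a real server
          would use a shared cache/DB, i.e. it is stateful either way."""

          def __init__(self, window_seconds=300):
              self.window = window_seconds
              self.seen = {}  # jti -> expiry timestamp

          def check_and_store(self, jti, now=None):
              now = time.time() if now is None else now
              # Drop entries whose window has expired.
              self.seen = {j: exp for j, exp in self.seen.items() if exp > now}
              if jti in self.seen:
                  return False  # replayed proof
              self.seen[jti] = now + self.window
              return True
      ```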

      In addition, the new requirements increase the reliance on JWT, which was always the weakest link for OIDC, by ditching client secrets (unless you're using mTLS, which nobody is going to use). Client secrets have their issues, but JWT is extremely hard to secure, and we've got so many CVEs for JWT libraries over the years that I've started treating it like I do SAML: Necessary evil, but I'd minimize contact with it as much as I can.
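      The classic failure mode behind many of those JWT CVEs is trusting the attacker-controlled "alg" header ("alg": "none", or RS256-to-HS256 confusion). A toy HS256 verifier (stdlib only, purely to illustrate the pinning point — use a vetted library in practice) shows what pinning looks like:

      ```python
      import base64
      import hashlib
      import hmac
      import json

      def b64url_decode(s: str) -> bytes:
          # Base64url with padding restored.
          return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

      def verify_hs256(token: str, secret: bytes) -> dict:
          header_b64, payload_b64, sig_b64 = token.split(".")
          header = json.loads(b64url_decode(header_b64))
          # Pin the algorithm: never dispatch on the attacker-controlled
          # header ("alg": "none" / RS256->HS256 confusion attacks).
          if header.get("alg") != "HS256":
              raise ValueError("unexpected alg")
          expected = hmac.new(
              secret, f"{header_b64}.{payload_b64}".encode(), hashlib.sha256
          ).digest()
          if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
              raise ValueError("bad signature")
          return json.loads(b64url_decode(payload_b64))
      ```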

      There are also some quirky requirements, like:

      1. Mandating "OpenID Connect Core 1.0 incorporating errata set 1 [OIDC]" - "errata set 2" is the current version, but even if you update that, what happens if a new version of OIDC comes out? Are you forced to use the older version for compliance?

      2. The TLS 1.2 ciphers are weird. DHE is pretty bad whichever way you look at it, and I get that some browsers do not support it, but why would you block ECDSA and ChaCha20-Poly1305? This would likely result in less secure ciphersuites being selected when the machine is capable of more.

      In short, the standard seems to be much better than FAPI 1.0, but I wouldn't say it's in a more complete state than OAuth 2.1.


  • Preferably the intersection of OIDC, OAuth 2.1, and the Best Current Practices.

    As in: use the OAuth 2.1 recommendations, including the OAuth Best Current Practices, to structure your clients inside the OIDC framework. This will mostly "just work" (if you are in control of the auth server), as it only removes from OIDC the possible setups you decide not to use.

    Though I'm not sure if requiring (instead of just allowing) PKCE is strictly OIDC compliant, implementations should support it anyway ("should" as in it is recommended for them to do so, not as in they probably do so).

    • "Through I'm not sure if requiring (instead of just allowing) PKCE is strictly OIDC compliant"

      It's technically not compliant, but people definitely do so, and there are definite security advantages to requiring it.

      Technically the 'nonce' in OpenID Connect provides the same protections, and hence the OAuth Security BCP says (in a lot more words) that you may omit PKCE when using OIDC. However, in practice — based on a period in the trenches that I've mostly repressed now — the way the mechanisms were designed means clients are far more likely to use PKCE correctly than to use nonce correctly.
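      Part of why clients get PKCE right is that the client side is mechanical. A minimal sketch of generating a verifier/challenge pair per RFC 7636 (S256 method), using only the stdlib:

      ```python
      import base64
      import hashlib
      import secrets

      def make_pkce_pair():
          # code_verifier: high-entropy URL-safe string (RFC 7636 §4.1);
          # 32 random bytes -> 43 base64url chars, within the 43-128 limit.
          verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
          # code_challenge = BASE64URL(SHA256(verifier)), no padding (§4.2).
          digest = hashlib.sha256(verifier.encode("ascii")).digest()
          challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
          return verifier, challenge
      ```

      The client sends the challenge in the authorization request and the verifier in the token request; the server just hashes and compares, with no state or JOSE machinery involved — unlike nonce, which the client has to validate itself inside the ID token.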

  • Both systems have grown too much and are too complicated. At some point someone will replace it with something easier.

    I used countless ID providers and while they offer a very valuable service, the flows are too complicated and many implementations have a lot of security flaws because the users don't understand the implications.

    The implicit flow was trying to make it easier.