
Comment by jogux

8 months ago

Yes, indeed. Both OAuth 2.1 & the BCP tighten things up a lot, although neither is technically final yet (the security BCP should be published as an RFC "any day now").

For people looking for an easy-to-follow interoperability/security profile for OAuth2 (assuming they're mostly interested in the authorization code flow, though it's not exclusive to that), FAPI2 is a good place to look; the most recent official version is here:

https://openid.net/specs/fapi-2_0-security-profile.html

This is a good bit shorter than the 50 or so pages of security BCP :)

FAPI2 should also be moving to 'final' status within the next few months.

(Disclaimer: I'm one of the authors of FAPI2, or will be once the updated version is published.)

On the flip side, it is much more complex to implement than OAuth 2.1, since it mandates a lot of extra standards, some of them very new and with very little in the way of library support: DPoP, PAR, private_key_jwt, Dynamic Client Registration, Authorization Server Metadata, etc.
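For readers who haven't met PAR before, here's a rough sketch of its shape (endpoints and client values are made up for illustration; real ones come from the AS metadata, and the POST must be authenticated, e.g. with private_key_jwt). The point is that the full authorization request is pushed to the AS over the back channel, and the browser redirect carries only an opaque request_uri:

```python
from urllib.parse import urlencode

# Hypothetical endpoints for illustration only; real values come from the
# authorization server's metadata document.
AUTHORIZE_ENDPOINT = "https://as.example.com/authorize"

# Step 1: the client POSTs the full authorization request to the PAR
# endpoint instead of putting it in the front-channel URL.
par_body = urlencode({
    "response_type": "code",
    "client_id": "my-client",
    "redirect_uri": "https://client.example.com/cb",
    "scope": "openid",
    "code_challenge": "E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM",
    "code_challenge_method": "S256",
})

# Step 2: the AS responds with a short-lived, one-time request_uri (shown
# here as a canned response); the browser is then redirected with only the
# client_id and that request_uri.
par_response = {"request_uri": "urn:ietf:params:oauth:request_uri:abc123",
                "expires_in": 60}
redirect = AUTHORIZE_ENDPOINT + "?" + urlencode({
    "client_id": "my-client",
    "request_uri": par_response["request_uri"],
})
print(redirect)
```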

Except for PAR, these extra requirements are harder to implement than their alternatives, and I'm not sold that they increase security. For instance, DPoP with a mandatory "jti" is not operationally different from having a stateful refresh token: you've got to store the nonce somewhere, after all. A stateful refresh token is simpler, removes the unnecessary reliance on asymmetric cryptography, and as a bonus saves some headaches down the road if quantum computers that can break EC cryptography become a thing.
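To make the "you've got to store the nonce somewhere" point concrete, here's a minimal sketch (names and the 5-minute window are illustrative, not from any spec) of the server-side state that enforcing jti uniqueness implies:

```python
import time

# Minimal sketch of the server-side state that DPoP's mandatory "jti"
# implies: each proof's jti must be remembered until the proof would have
# expired anyway, or a captured proof can be replayed. Class name and the
# replay window are illustrative assumptions.
class JtiReplayCache:
    def __init__(self, window_seconds=300):
        self.window = window_seconds
        self.seen = {}  # jti -> expiry timestamp

    def check_and_store(self, jti, now=None):
        """Return True if the jti is fresh; False if it's a replay."""
        now = now if now is not None else time.time()
        # Evict expired entries so the cache doesn't grow without bound.
        self.seen = {j: exp for j, exp in self.seen.items() if exp > now}
        if jti in self.seen:
            return False  # replayed proof
        self.seen[jti] = now + self.window
        return True

cache = JtiReplayCache()
print(cache.check_and_store("abc"))  # first use: fresh
print(cache.check_and_store("abc"))  # second use: replay
```

In a clustered deployment this cache has to be shared storage, which is exactly the kind of state a stateful refresh token already requires.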

In addition, the new requirements increase the reliance on JWT, which was always the weakest link for OIDC, by ditching client secrets (unless you're using mTLS, which nobody is going to use). Client secrets have their issues, but JWT is extremely hard to secure, and we've seen so many CVEs for JWT libraries over the years that I've started treating it like I do SAML: a necessary evil, but one I minimize contact with as much as I can.
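The classic JWT library CVEs come from letting the token's own header choose the algorithm ("alg": "none", or RS256-to-HS256 confusion). A toy stdlib-only sketch of the defensive posture (HS256 only, algorithm pinned; a real verifier must also check exp/aud/iss claims, and these helper names are mine, not from any library):

```python
import base64, hashlib, hmac, json

def b64url(raw):
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def b64url_decode(s):
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def sign_hs256(claims, key):
    """Toy HS256 signer, just to produce test tokens."""
    h = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    p = b64url(json.dumps(claims).encode())
    sig = hmac.new(key, f"{h}.{p}".encode(), hashlib.sha256).digest()
    return f"{h}.{p}.{b64url(sig)}"

def verify_hs256(token, key):
    """Verify an HS256 JWT with the algorithm pinned by the verifier.
    Sketch only: real code must also validate exp/aud/iss claims."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    header = json.loads(b64url_decode(header_b64))
    # Never let the token's header pick the algorithm - pin it here.
    if header.get("alg") != "HS256":
        raise ValueError("unexpected alg")
    expected = hmac.new(key, f"{header_b64}.{payload_b64}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload_b64))

token = sign_hs256({"sub": "alice"}, b"secret")
print(verify_hs256(token, b"secret"))
```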

There are also some quirky requirements, like:

1. Mandating "OpenID Connect Core 1.0 incorporating errata set 1 [OIDC]" - "errata set 2" is the current version, but even if you update that, what happens if a new version of OIDC comes out? Are you forced to use the older version for compliance?

2. The TLS 1.2 cipher requirements are weird. DHE is pretty bad whichever way you look at it, and I get that some browsers don't support it, but why would you block ECDSA and ChaCha20-Poly1305? This would likely result in less secure ciphersuites being selected on machines that are capable of more.
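To illustrate the point about not excluding ECDSA or ChaCha20-Poly1305, here's a sketch (assuming Python's ssl module on a modern OpenSSL build; this is one possible policy, not the profile's) of a TLS 1.2+ configuration that permits the ECDHE AEAD families including both contested options:

```python
import ssl

# Sketch of a permissive-but-strong policy, not a definitive profile:
# ECDHE key exchange with AES-GCM or ChaCha20-Poly1305, which covers both
# RSA and ECDSA server certificates. TLS 1.3 suites are controlled
# separately by OpenSSL and remain enabled.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")

for c in ctx.get_ciphers():
    print(c["name"])
```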

In short, the standard seems to be much better than FAPI 1.0, but I wouldn't say it's in a more complete state than OAuth 2.1.

  • DPoP isn't mandated; a lot of people select mTLS sender-constrained access tokens instead of DPoP. (And yes, I agree, mTLS has challenges in some cases.)

    Stateful refresh tokens have other practical issues; we've seen several cases in Open Banking ecosystems where stateful refresh tokens resulted in loss of access for large numbers of users when things went wrong.

    The quirks you mention are sorted in the next revision. The cipher requirements come from the IETF TLS BCP [1] (which is clearer in the new version). If you think the IETF TLS WG got it wrong, please do tell them.

    As other people have said elsewhere, this isn't about completeness - OAuth 2.1 is a framework, while FAPI is something concrete you can, for the most part, just follow, and then use the FAPI conformance tests to confirm whether you implemented it correctly. If you design an authorization code flow following all the recommendations in OAuth 2.1, you'll end up implementing FAPI. Most people outside this space who implement OAuth will struggle to know how to avoid the traps once they stop following the recommendations, as "implementing OAuth securely" isn't usually their primary mission.

    1: https://www.rfc-editor.org/info/bcp195

    • How common is it to use mTLS in the user-to-service case (e.g. browsers with mTLS configured)? I mean, for (potentially external) service-to-service authentication it's way easier than for user(browser,app)-to-service.