OpenID Connect specifications published as ISO standards

8 months ago (self-issued.info)

It took me an embarrassingly long time (given how keenly involved I was in OpenID stuff ~17 years ago https://simonwillison.net/search/?tag=openid&year=2007) to understand that OpenID Connect is almost unrelated to the original idea of OpenID where your identity is a URL and you can prove that you own that URL.

OpenID Connect is effectively an evolution of OAuth.

  • You may already know this, I'm writing it as a note for my future self.

    OpenID Connect (OIDC) is mostly concerned with authentication. On the other hand, OAuth (or, to be more specific, OAuth v2.0) is concerned with authorization.

    >> OpenID Connect is effectively an evolution of OAuth.

    In my opinion, OpenID Connect is actually an evolution of OpenID – in its vision/spirit:

    - OIDC, like OpenID, primarily focuses on users' identity and authentication;

    - OIDC, unlike OpenID, didn't (re)invent new authentication workflows that were significantly different in their own ways. Instead, it built its authentication workflows on top of the existing OAuth spec (which was already being (ab)used for authentication in some places and, unfortunately, still is) to achieve its main objective (i.e. authentication).

    ---

    Edit: rephrased to better communicate my thoughts (still not perfect; but, as the saying goes, perfect is the enemy of the good so I stop here).

    • > OIDC, unlike OpenID, didn't (re)invent new authentication workflows, which were significantly different in their own ways. Instead, it built authentication workflows on top of existing OAuth spec

      Didn't OpenID predate OAuth? What should OpenID have built upon?

      2 replies →

    • My problem with it being called OpenID Connect is that, in my head, an OpenID is a noun which means "a URL that you can use as your identity and prove that you own".

      That definition doesn't work for OpenID Connect. Is OpenID a noun any more? I don't think it is.

      6 replies →

    • > OpenID Connect (OIDC) is mostly concerned with authentication. On the other hand, OAuth (or, to be more specific, OAuth v2.0) is concerned with authorization.

      That's a common refrain and it's quite inaccurate in that it's using a rather unorthodox definition of these terms. It's like the classic "The United States is not a Democracy, it is a Republic", where the speaker reinterprets Democracy as "Direct Democracy" and Republic as "Representative Democracy"[1].

      Same goes for "authorization" and "authentication" in OAuth and OIDC. In the normal sense, authentication deals with establishing the user's identity, while authorization determines what resources the user can access.

      Open ID Connect does indeed deal with authentication: it's a federated authentication protocol. The old OpenID also tried to introduce the concept of a globally unique identity in addition to authenticating that identity, as the GP mentioned. But OpenID Connect still supports federation: an application (the consumer) can accept users logging in through a completely unrelated service (the identity provider). I believe this was originally specified mostly with the idea of third-party authentication in mind (using Google or Apple to log in to another company's service, or conversely using your corporate SSO to log in to a SaaS web app), but microservices are very popular nowadays, and even services that don't support external login often use OIDC as the authentication protocol between their authentication microservice and other services.

      OAuth, on the other hand, started as a method for constrained access delegation. It allowed web services to issue a constrained access token that is authorized for doing a certain set of operations (defined by the "scope" parameter and often explicitly approved by the user). Getting constrained access through OAuth requires performing authentication, so you can say OAuth is also an authentication standard in a sense. But since OAuth was designed for pure delegation, it does not provide any user identity information along with its access token, and there was no standard way to use it for federated authentication. Open ID Connect essentially takes OAuth and adds identity information on top. To make things more complicated, there have been a lot of OAuth specifications published as RFCs[2] over the last decade, and a lot of them deal with client authentication and explicitly mention Open ID Connect (since arguably most OAuth implementations are also OIDC implementations now).
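      That "adds identity information on top" is visible directly in the token response shapes. A rough sketch (field names are from the specs; the example values and scope are made up):

```python
# Shape of a plain OAuth 2.0 token response (RFC 6749): the access
# token is opaque to the client and carries no standardized identity.
oauth_response = {
    "access_token": "2YotnFZFEjr1zCsicMWpAA",  # made-up opaque token
    "token_type": "Bearer",
    "expires_in": 3600,
    "scope": "photos:read",  # hypothetical scope for this example
}

# An OIDC token response adds an ID token: a signed JWT whose payload
# carries standardized identity claims (iss, sub, aud, exp, ...).
oidc_response = {
    **oauth_response,
    "id_token": "<header>.<payload>.<signature>",  # placeholder, not a real JWT
}

def provides_identity(token_response: dict) -> bool:
    """OIDC's addition over plain OAuth is exactly the id_token."""
    return "id_token" in token_response
```

      Everything else in the OIDC response is unchanged OAuth, which is why the two are so often conflated.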

      In short, Open ID Connect is quite accurately described as an Authentication standard. But OAuth 2.0 has little to do with Authorization. It allows clients to specify the "scope" parameter, but does not determine how scopes are parsed, when user and client are permitted to request or grant a certain scope and what kind of access control model (RBAC, ABAC, PBAC, etc.) is used. That's ok, since it leaves the implementers with a lot of flexibility, but it clearly means OAuth 2.0 is not an authorization standard. It only concerns itself with requesting authorization in unstructured form[3].

      Better examples of proper authorization standards are declarative authorization specification DSLs like XACML, Alfa, and Rego (the language used in Open Policy Agent). I guess you could also put RADIUS in as an example of a protocol that implements both authentication and authorization (although the authorization is mostly related to network resources).

      ---

      [1] To be fair, the meanings of "Democracy" and "Republic" have changed over time, and back in the 18th century, when the US was founded, it was popular to view the Athenian Democracy as the model case of a pure direct democracy and to use the term "Democracy" in this sense. Over time, the meanings changed and we got a weird aphorism that remains quite baffling to us non-Americans.

      [2] https://oauth.net/2/

      [3] RFC 9396 is a very recently published optional extension to OAuth that does structured authorization requests, and defines a standard method for resource server to query the requested and granted authorization data.

      4 replies →

  • Also interesting is that OAuth2 is a bit too flexible in how you can put things together, and OIDC provides a lot of good practice about how to do so.

    So even systems where OIDC compliance is a non-goal are often partially OIDC compliant; there really is no reason to reinvent the wheel if parts of the OIDC standard already provide all you need.

  • The naming is a complete nightmare.

    OpenID Connect is an extension to OAuth2 (RFC 6749) that adds an authentication layer (to identify a user) on top of the OAuth2 authorization framework (that grants permissions)
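    One concrete way to see the layering: both protocols share the same authorization endpoint, and it's requesting the `openid` scope that asks for the OIDC authentication layer on top of the OAuth2 request. A minimal sketch (the endpoint URL, client id, and non-OIDC scope are made up for illustration):

```python
from urllib.parse import urlencode

AUTHORIZE_ENDPOINT = "https://idp.example.com/authorize"  # hypothetical provider

def authorization_url(client_id: str, redirect_uri: str, oidc: bool = False) -> str:
    # Standard OAuth2 authorization-code request parameters (RFC 6749 §4.1.1).
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        # Including the "openid" scope is what requests the OIDC layer:
        # the token endpoint will then also return a signed id_token.
        "scope": "openid profile email" if oidc else "photos:read",
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"
```

    A real client would also send `state` (and usually PKCE parameters); this only shows where the two specs meet.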

    Earlier versions of OAuth (1.0 and 1.0a) and OpenID (1 and 2) are unrelated, incompatible protocols that share similar names but are largely irrelevant in 2024.

    Google with care.

    • Also "OpenID Connect" sounds like some sort of branded product rather than a technical standard.

  • Your talk at webstock back in 2008 is what originally got me interested in OpenID.

  • Yeah, it's a profile on top of OAuth, which leverages aspects (the authorization code grant, tokens) but adds some other functionality (another token with authentication information and some defined claims). I'm not aware of any other profiles with anywhere near the uptake of OIDC.

    There are a few folks out there doing pure OAuth, but much of the time it is paired with OIDC. It's pretty darn common to want someone to be authenticated the same time they are authorized, especially in a first party context.

    I wrote more on OIDC here: https://fusionauth.io/articles/identity-basics/what-is-oidc

  • No, OIDC is not an evolution of OAuth. One does authentication, the other authorization. They are two very different but often intertwined concepts, and both can be used without requiring the other.

  • Is it a question of how bad OAuth 2.0 is then?

    > David Harris, author of the email client Pegasus Mail, has criticised OAuth 2.0 as "an absolute dog's breakfast" (https://en.m.wikipedia.org/wiki/OAuth)

    I keep on trying and failing to implement / understand OAuth 2 and honestly feel I need to go right back to square one to grok the “modern” auth/auth stack

    • It’s funny, I’m the opposite. I love OAuth for what it does, that is, federate permission to do stuff across applications. It makes a lot of cool integration use cases possible across software vendor ecosystems, and has single-handedly made interoperability between wildly different applications possible in the first place.

      I’d say it definitely helps to implement an authorisation server from scratch once, and you’ll realise it actually isn’t a complex protocol at all! Most of the confusion stems from the many different flows there were at the beginning, but most of that has been smoothed out by now.

    • Eran Hammer (the author of OAuth 1.0 and original editor of the OAuth 2.0 spec) resigned during the early draft specification process and wrote a more detailed criticism[1].

      I don't think I agree with every point he makes, but I think he had the right gist. OAuth 2.0 became too enterprise-oriented and prioritized enterprise-friendliness over security. Too many implementation options were made available (like the implicit grant and the password grant).

      [1] https://gist.github.com/nckroy/dd2d4dfc86f7d13045ad715377b6a...

      2 replies →

No aspect of this is good for anyone. First, standards you have to pay to obtain are a really, really bad thing. Second, I wish more effort would go into designing standards and implementations that aren't such an endless time sink when you need them.

  • I agree about ISO, but I don't think there's a meaningful "toll gate" in this case: the standards are already free and public, this seems to just assign them identities in the ISO's standardization namespace.

    (I'm at a loss to explain what benefit comes from being assigned an ISO standard versus putting a HTML document on the Internet.)

    • > (I'm at a loss to explain what benefit comes from being assigned an ISO standard versus putting a HTML document on the Internet.)

      From the article:

      "[ISO certification] should foster even broader adoption of OpenID Connect by enabling deployments in jurisdictions around the world that have legal requirements to use specifications from standards bodies recognized by international treaties, of which ISO is one."

      6 replies →

    • Any sort of government or similarly "official" organization loves to refer to ISO standard XXXX instead of writing out a summary of the standard when they document things.

      Sometimes you see the same thing with organizations referring to web RFCs. It's likely because of a general culture of "don't try to invent new things if you already have a reference for it", although it doesn't really tend to make those documents readable.

    • > (I'm at a loss to explain what benefit comes from being assigned an ISO standard versus putting a HTML document on the Internet.)

      Single source of truth. The internet has been plagued by numerous incompatible implementations of the same thing. There are numerous tests [0] showing incompatibility between implementations of even a simple serialization format like JSON. How many times have you heard "Yeah, nice feature, but virtually nothing implements it"? A standard becomes whatever the majority of highly adopted implementations do instead of a formal specification. This is what you get for putting an HTML document on the internet. ISO standardization somewhat reduces this effect.
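      A concrete instance of that JSON divergence: the spec leaves the handling of duplicate object keys implementation-defined, so conforming parsers legitimately disagree. CPython's stdlib parser, for one, silently keeps the last value:

```python
import json

# RFC 8259 says object names "SHOULD be unique" but does not mandate it,
# so parsers may keep the first value, the last value, or reject the input.
doc = '{"amount": 1, "amount": 100}'

parsed = json.loads(doc)  # CPython's json module keeps the last occurrence
print(parsed)  # {'amount': 100}
```

      Other parsers keep the first value or raise an error, and all three behaviours are spec-compliant, which is exactly how "the standard" becomes whatever popular implementations do.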

      > but I don't think there's a meaningful "toll gate" in this case: the standards are already free and public

      A major problem with ISO standards is that they cross-reference each other. It's rare NOT to find a definition like "X as defined in ISO 12345". A complex product may need to reference hundreds of ISO standards.

      Somewhat tautologically, I agree with you: in reality things are probably going to be implemented by referencing subtly incompatible tutorials on the internet while claiming ISO compatibility.

      [0]: https://www.getlazarus.org/json/tests/

      1 reply →

    • > I don't think there's a meaningful "toll gate" in this case: the standards are already free and public

      See Adobe and PDF: PDF 1.7 was available gratis from Adobe and also (“technically identical to”) an ISO standard. At the time, people expressed concerns about ISO’s paywalls and Adobe reassured them there was an agreement to ensure that wouldn’t happen. Indeed it did not... until PDF 2.0 came along, developed at the ISO, and completely paywalled.

      I seem to remember (but don’t quote me on that) that AVIF and JPEG XL standards were at one point downloadable free of charge. In any case, they aren’t today.

      1 reply →

  • Why does the internet not make this stuff an RFC? Email and TCP are RFCs, as are other critical protocols, and global companies use them all the time.

    • Historically the IETF has been reluctant to get involved with Identity (and hence authentication) for various reasons. There are a few standards bodies in this area and they all have their strengths and weaknesses (the presentation by Heather Flanagan someone linked to elsewhere in the thread gives a good introduction).

      Even some RFCs are basically available as ISO standards and vice versa; e.g. for time/date formats you almost never need to buy ISO 8601 and can just read RFC 3339 (which is technically a 'profile' of ISO 8601).
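      In practice the RFC 3339 profile covers most of what programs need, and it parses with stock tooling; e.g. Python's stdlib handles RFC 3339-style timestamps directly (this sketch sticks to the numeric-offset form, which also works on older Python versions that don't accept a trailing "Z"):

```python
from datetime import datetime, timezone

# RFC 3339 timestamps are a strict, unambiguous subset of ISO 8601:
# full date, "T", full time, and a mandatory UTC offset.
ts = datetime.fromisoformat("2024-09-18T14:30:00+02:00")

utc = ts.astimezone(timezone.utc)
print(utc.isoformat())  # 2024-09-18T12:30:00+00:00
```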

Standards are nice, but the large standards organizations like ISO annoyingly charge a bit to view them. I suppose this is because some businesses/industries require "real" standards by those orgs rather than the IETF or other dirty open-source hippie collectives.

  • "Standard" and "costs money" feel at odds with each other. If you want something to become standard, as in the standard/most common way of doing something, it has to be abundantly accessible so that it can be widely implemented.

    • It's a double-edged sword. Actually creating a good standard that people want to use, through an open process that aims to be unbiased, takes a non-trivial amount of time and hence costs a not insubstantial amount of money. Different standards organisations have chosen different approaches to solving that problem, and although freely available standards are my preferred approach, it is also very clear that ISO standards are well respected and widely used despite the need to pay to view them in some cases.

      1 reply →

  • Most standards are low priced enough that they are sold at a loss. If you would prefer to donate to ISO (or the IEEE for another example) instead, that would help allay the cost of writing a standard.

  • My understanding is that it is quite often government/country contexts where (because ISO is recognised in various international treaties) it is easier to get approval to use an ISO standard than it is to use an OpenID Foundation standard. So getting OpenID Connect published with an ISO number just makes adoption easier for some projects.

    OpenID Connect does of course remain free to view/use, but now people in the above situation have an easier option available to them.

ISO is non-free garbage which is not helping the software ecosystem. Take ISO 8601: it is overly complicated, not implemented correctly most of the time because maintainers used a free draft, and it does not actually solve anything properly. (For example, you cannot represent wall-clock time, which is a problem for dates in the future as time zones change.)

I also worked with mp4 in the past, only to realize that the ISO spec was not enough as Apple had some changes in their stack.

  • > For example you cannot represent wall clock which is a problem for dates in the future as time zone changes

    While I understand this particular frustration, in my book it is a feature. The critique usually devolves into hypothetical scenarios, e.g. changes to DST. "I want to be able to specify 14:00 in four years in Absurdistan local time, whatever that is in relation to UTC, but cannot!" is a common critique of ISO 8601. However, if you go a little bit further with hypotheticals, Absurdistan might add some overseas territories, might join some alliance and change timezone/DST, etc.

    When you think about the problem statement, definition of "local time" itself may change, therefore it is impossible to specify "local time" in the future without *exhaustively* defining all possible changes.

    So you either define a number of atomic ticks (TAI) in the future and resolve local time at the time of use, or specify some static time and resolve it to local time at the time of use.
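    The two strategies can be illustrated with plain fixed offsets (no tz database needed, and "Absurdistan" and its offsets are of course made up). Storing an instant pins a point on the global timeline; if the locale later changes its offset, the stored instant no longer falls at the intended wall-clock time:

```python
from datetime import datetime, timedelta, timezone

# Strategy 1: store the event as an absolute instant, resolved today
# under the current offset of hypothetical Absurdistan, UTC+02:00.
event = datetime(2030, 7, 1, 14, 0, tzinfo=timezone(timedelta(hours=2)))
instant = event.astimezone(timezone.utc)  # 2030-07-01 12:00 UTC, fixed forever

# Later, Absurdistan moves to UTC+03:00. Resolving the stored instant
# under the new rules no longer yields 14:00 on the wall clock:
new_local = instant.astimezone(timezone(timedelta(hours=3)))
print(new_local.hour)  # 15 -- the 14:00 meeting drifted by an hour

# Strategy 2: store the wall-clock fields plus a zone *name*, and only
# resolve to an instant at the time of use, under the rules current then.
stored = {"wall": "2030-07-01T14:00", "zone": "Asia/Absurdistan"}  # made-up zone
```

    ISO 8601 only gives you a notation for strategy 1 (or for zoneless local times); strategy 2 needs the zone identifier carried out of band.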

  • Are there any alternatives to ISO 8601? My only beef with it is that there seems to be more than one way to represent a few things. Didn't know about the wall clock issue.

    I wonder if the new JS temporal API handles that.. they went pretty deep.

Pay-to-read standards like those of the ISO actively hinder human progress. Please don't encourage this behaviour.

  • With C++, the latest draft of the standard is made available for free [0]. My understanding is that the final draft and the official standard are more or less same w.r.t. their material content. I imagine the draft standard for OIDC is also available somewhere.

    [0] https://en.cppreference.com/w/cpp/links#C.2B.2B_standard_doc...

    • Almost every standard is like this. If you just want to implement, you take the last draft that's public. The process between that draft and a standard is an extensive review and editing process to ensure that wording is exactly precise, patent claims against the standard are void or invalid, and that there are no other problems. That stuff takes time and money.

    • > My understanding is that the final draft and the official standard are more or less same w.r.t. their material content.

      Details matter. So not so open after all?

  • Not everything MUST be black and white. It is okay to admit things are grey. Those engineers are not actively harming human progress, they are creating it, and it's insane to say otherwise.

Identity provisioning is an abomination that shouldn't have been invented. I used to be a fan back in mid-'00s, self-hosting an OpenID server, without realizing how the whole concept is so fundamentally flawed.

Identity is an innate and inalienable property of an individual, not something that anyone else (another person, a company/website, a government or whoever else) can "provide". They can merely attest to it by providing a credential, e.g. by issuing a passport.

At least Webauthn got this right.

  • But does this not make the assumption that the Identity being provisioned is exactly you and only you? I've always seen these identities as my pseudonym on some identity provider and use them in that manner.

    I suppose I've used some identities in enough places that it would be hard to deny to certain entities that the identity was mine, but even in that case it's a small subset of entities which have seen the identity that could prove that it's me.

  • Is there a practical consequence to this distinction between attestation and provisioning, or is it purely philosophical?

    • For me personally, it's primarily philosophical - I don't want to be defined by someone else, only confirmed that it's, indeed, me.

      However, there are practical consequences. There are plenty of stories of how people got their Google/Facebook/Apple/... accounts blocked (or, for self-hosting folks, let their domain name lapse) and thus lost "their" "identity" (quotes for lack of better words).

      One can back up their credentials and attestations - losing a password (or a keypair) is preventable, since it's all first-party. One cannot back up a third-party service.

      3 replies →

Is there an independent OIDC issuer (outside of Google, MS or Apple) left one can create an account on?

I wanted to open an account on Tailscale without using my Github account the other day but couldn't.

I believe openid.net and Ubuntu One were providing this service a long time ago but they discontinued it.

  • There are at least a few others left; https://gitlab.com is one I regularly use.

    Sadly, the amount of money you need to spend on security & support makes offering such a service (particularly for free) not viable for smaller entities. There are some big economies of scale needed to make these things viable, and they work particularly well if you can also get big companies to pay for commercial offerings.

OpenID connect is a rather simple protocol. I was able to understand most of it in about a day by reading the specs (https://openid.net/specs/openid-connect-core-1_0.html). For anybody that's interested and doesn't want to read these specs, I've written a comprehensive tutorial on how to implement a client for OpenID using simple HTTP requests (https://spapas.github.io/2023/11/29/openid-connect-tutorial/).

It uses Python to do the work, but it should be straightforward to implement in anything you want. Most of the complex stuff is related to decoding and checking the JWT tokens.
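To show the shape of that JWT work: a token's payload is just base64url-encoded JSON between two dots. This sketch deliberately skips signature verification, which a real client must do against the provider's published JWKS keys before trusting any claim:

```python
import base64
import json

def jwt_claims_unverified(token: str) -> dict:
    """Decode a JWT's payload WITHOUT verifying the signature.

    Never trust the result for authentication; this only illustrates
    that the wire format is header.payload.signature, with each part
    base64url-encoded (and the "=" padding stripped).
    """
    payload_b64 = token.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(padded))
```

Checking the signature, issuer, audience, expiry and nonce on top of this is where the real complexity (and the bulk of the tutorial) lives.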

I'm using exactly this hand written client on a production project to authenticate with keycloak for like a year and everything's working perfectly!

PS: I know that there are way too many ads to my site. Unfortunately I haven't found the time to properly configure google ads and haven't found anything better :( Use an ad-blocker to read it.

  • Very interesting and good write-up.

    PS. Be cautious with using subjective words like "simple". It can be really off-putting as a reader if you think something is difficult and the author claims it's simple.

    • everything is relative. I believe it’s “simple” compared to other standards like SAML.

      I agree it can be a little intimidating for a novice in the space!

  • A great tutorial.

    I'm still not convinced that OIDC is easy. Keycloak hides enormous complexity and that's not because the developers were bored. One example is the huge number of settings for various timeouts: SSO timeouts, client timeouts, various token timeouts.

The whole monetization and organization around ISO standards feels super shady.

One lesser-known hack is to search the friendly Estonian site [1] for a cheaper version of the standard - they often create their own versions of the standards which pretty much contain the exact same content as the original. Unfortunately, in this case, it seems they are only offering the actual standard at a similar price [2]. Sad dog face.

It could be worthwhile to monitor the website to see if they release their own version for a better price in the future. Usually, their prices are ~10% of the original price (one more data point that Estonia does cool stuff).

We deal with these rather shady standardization organizations quite a lot, as we work in medical device compliance [3]. I've heard all the usual arguments: "But standardization costs money!", "These organizations are doing good work!", etc., etc. No. I completely disagree. If something's a standard, that in my opinion makes it similar to a law - people should be able to follow it, and that requires people to be able to access it freely. The EU Advocate General seems to agree [4]. And there are lots of standards which don't rely on shadily offering PDFs for money: ECMAScript and ANSI C come to mind, but the list goes on.

[1] https://evs.ee [2] https://www.evs.ee/en/search?OnlySuggestedProducts=false&que... [3] https://openregulatory.com/accessing-standards/ [4] https://openregulatory.com/maybe-eu-standards-are-becoming-f...

Making it an ISO publication makes it an ass cover for procurement. Nobody got fired for demanding compliance with a pile of ISO standards after all.

Review: Mike Jones is one of the 3 members of the OIDC working group. He celebrates the publication of the spec as a publicly accessible standard (PAS) and has worked to include the errata so that it is a complete document.

Congratulations to the achievement that is OIDC!

  • > publicly accessible standard (PAS)

    ISO standards are not publicly accessible standards. OIDC was always publicly accessible; now I dunno. ISO OIDC is and will be meaningless. ISO is a racket to keep people with no useful skills employed, and the organization should be fined for the taxpayer money it took and should be sanctioned by every nation in the world.

I fear this will be as adopted as PDF/2.0, the first non-free (and ISO) specification of the PDF format.

Hopefully the draft versions are freely available somewhere and closely resemble the final product^W specification.

The link through to PAS says:

> Publicly Available Specifications have a maximum life of six years, after which they can be transformed into an International Standard or withdrawn.

So is the plan to transform this into a standard after that period? Was a PAS application chosen because (it sounds like) it goes through the standards body quicker? So this gives an intermediate ISO seal of validation until this becomes a full International Standard?

ISO should be fined and abolished, and the money it took from taxpayers should be recuperated from the frauds that ran it.

Guess it's a great way to spend the last bits of money of the yearly department budget so you won't lose it in 2025.

I searched for a working server 2 years ago and found none ;(

  • Have you tried https://www.keycloak.org/?

    • In my experience KeyCloak can be a very mixed bag.

      And if you are especially unlucky might be so painful to use that you could say it doesn't work.

      But for a lot of use cases it does work well.

      In some contexts it might save you not just developer months, but years (e.g. certain use cases with certification/policy enforcement aspects).

      But it can also be a story of running from one rough edge into another, where your project lead/manager starts doubling or tripling any time estimate for stories involving Keycloak.

      E.g. the REST API of Keycloak (for programmatic management of it) is very usable but full of inconsistencies and terribly documented (I mean, there is an OpenAPI spec, but it very rarely gives you a satisfying answer about the meaning of a listed parameter beyond a non-descriptive 3-word description). (It's also improving with each version.)

      Similarly, multi-tenancy can be a pain, depending on what exactly you need. UMA can be great or a catastrophe, again depending on your specific use cases. SSO user management can be just fine or very painful. There is a UI for customizing the auth flow, but it has a ton of subtle internal/implementation-detail constraints and mechanics that are not well documented and that you have to know to use it effectively; so unless you can copy-paste someone else's solution or only need trivial changes, this can be a huge trap (looking like an easy change but being everything but that)...

      The built-in mail templates work, but can get your mail delivery (silently) dropped (not sure why; maybe some scammers used Keycloak before).

      The default user-facing UI works, but you will have to customize it, even if just for consistent branding, and it uses its own Java-specific rendering system. (Consistent branding here isn't just a looks goal: one of the first things taught in scam-avoidance courses for non-technical people is that if it looks very different, it's probably a scam and you shouldn't log in.)

      I think Keycloak is somewhat of a no-brainer for large teams, but can be a dangerous trap for very small teams.

      2 replies →

  • We run Apereo CAS pretty successfully. Originally to use the CAS protocol, but now that CAS (the protocol) has been deprecated, we're slowly migrating to OIDC. One sort of weird note about Apereo CAS: OpenID Connect can return data in two formats, nested and flat. CAS is the only server I've ever worked with that defaults to nested. Almost no clients support this, but the server can be reconfigured to use flat.
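    For anyone puzzling over nested vs flat: the difference is whether extra claims arrive grouped under a parent key or merged into the top level of the payload. A tiny illustrative flattener (the claim names and the `attributes` grouping key are made up for the example; real deployments should just configure the server to emit the flat form clients expect):

```python
def flatten_claims(claims: dict) -> dict:
    """Merge one level of nested claim objects up to the top level.

    Illustrative only: hoists values from any nested dict (e.g. an
    "attributes" grouping) so clients that expect flat claims find them.
    """
    flat = {}
    for key, value in claims.items():
        if isinstance(value, dict):
            flat.update(value)  # hoist nested attributes one level up
        else:
            flat[key] = value
    return flat

nested = {"sub": "jdoe", "attributes": {"email": "jdoe@example.org", "dept": "IT"}}
print(flatten_claims(nested))
```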

    KeyCloak is also very good, but I'd run it as a container due to the quick release/update cycle. If I had to do our infrastructure over, I'd probably go for KeyCloak, just because it's the most used.

  • open source: keycloak connect2id dex gluu

    closed source self hosted: adfs

    hosted: okta (auth0) google microsoft github amazon

    these are just the ones that were viable 2 years ago

ISO means It's so overcomplicated i think.

This is water under the bridge, but it's a bit ironic to me that I had to create an account to comment here. If OpenID 2.0 had succeeded, I would be commenting with my identifier https://self-issued.info/, which remains a valid, verifiable OpenID 2.0 identifier. You can still comment on my blog with OpenID 2.0 identifiers today. Have at it!