
Comment by forty

3 years ago

To be fair, any JSON implementation is going to have a practical limit on key size; it's just somewhat arbitrary and harder to discover :)

If you mean limited by available memory, then sure, but that does not apply just to key size. If you mean something else, could you elaborate?

  • Another reason to have a limit well below the computer's memory capacity is that one could find ill-formed documents in the wild, e.g., an unclosed quotation mark causing the "rest" of a potentially large file to be read as a single key, which can quickly snowball (imagine if you need to store the keys in a database, in a log, if your algorithms need to copy the keys, etc.)

  • I assume JSON implementations have some limit on key size (or on the whole document, which bounds the key size), hopefully far below the available memory.

    • I assume and hope that they do not, if there is no rule stating that such documents are invalid. There are valid reasons for JSON to have massive keys. A simple one: depending on the programming language and libraries used, an unordered array ["a","b","c"] might be better mapped as a dictionary {"a":1,"b":1,"c":1}. Now all of your keys are semantically values, and any limit imposed on keys only makes sense if the same limit is also imposed on values.


  • I guess it is about different implementations of parts of the JSON spec that are not properly formalized.

    There was also an article here some time ago but I cannot find it right now.
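
The array-to-dict mapping described in the thread can be sketched in Python with the built-in json module (a minimal illustration; the names here are made up for the example):

```python
import json

# An unordered collection serialized as a JSON array...
doc = '["a", "b", "c"]'
items = json.loads(doc)

# ...may be better represented as a dict for O(1) membership tests.
# The former values now appear as keys, so any length cap on keys
# would effectively cap values too.
as_dict = {item: 1 for item in items}

print(json.dumps(as_dict))  # {"a": 1, "b": 1, "c": 1}
assert "b" in as_dict
```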
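
If one did want a key-size limit well below available memory, Python's json module allows enforcing it at parse time via object_pairs_hook (a sketch; the cap value and helper name are arbitrary choices for the example):

```python
import json

MAX_KEY_LEN = 1024  # arbitrary cap, far below available memory

def check_keys(pairs):
    """Reject objects whose keys exceed the cap before they can
    propagate into logs, databases, or downstream copies."""
    for key, _ in pairs:
        if len(key) > MAX_KEY_LEN:
            raise ValueError(f"JSON key exceeds {MAX_KEY_LEN} characters")
    return dict(pairs)

# A well-formed document with short keys parses normally.
print(json.loads('{"ok": 1}', object_pairs_hook=check_keys))

# An oversized key is rejected instead of being stored.
try:
    json.loads('{"%s": 1}' % ("x" * 2000), object_pairs_hook=check_keys)
except ValueError as e:
    print(e)  # JSON key exceeds 1024 characters
```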