This is superb. Thank you for making it and licensing it MIT. I think this is a contender to replace the lexer within jank. I'll do some benchmarking next year and we'll see!
Wow, that is great news! Thanks for looking at it from this perspective! There are some benchmarks already available in the project - https://github.com/DotFox/edn.c/blob/main/bench/bench_integr...
You can run them locally with `make bench bench-clj bench-wasm`.
Let me know if I can do anything to help you with support in jank.
It looks like the key piece missing for lexer use is source information (bare minimum: byte offset and size). I don't think edn.c can be used as a lexer without that, since error reporting requires accurate source information.
As a side note, I'm curious how much AI was used in the creation of edn.c. These days, I like to get a measure of that for every library I use.
Oooo that’d be nice.
Interesting, I had to look up what EDN is. Important to note that EDN doesn't have a concept of a schema like JSON Schema.
This is a `map`, which resembles a JSON object. The following might look like an incorrect payload, but will actually parse as valid EDN:
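{:a 1, "foo" :bar, [1 2 3] four}
; Note that keys and values can be elements of any type.
; The commas above are optional, as they are parsed as whitespace.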
If one wants to exchange complex data structures, Aterm is also an option: https://homepages.cwi.nl/~daybuild/daily-books/technology/at...
Some projects in Haskell use ATerms, as the format is suitable for exchanging sum and product types.
One clever difference is that you can use namespaced keywords. For instance, to leave no doubt about what :username might mean, you could use :com.ycombinator.news/username as the key.
If you've worked exclusively in statically typed languages, it takes a while to grasp how convenient this is when you're mixing data from different sources.
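For example, a hypothetical payload that mixes keys from two sources:

{:com.ycombinator.news/username "alice"
 :example.billing/username "cust-1042"}
; the two "username" keys coexist without ambiguity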
One of the key design principles of EDN is to be purely a data exchange format. The same is true even of JSON, where JSON Schema is something that sits on top of JSON itself. Likewise for EDN: in Clojure there is clojure.spec, which adds schema-like notation, validation rules, and conformance (https://clojure.org/about/spec); something like this could be implemented in other languages as well.
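A minimal clojure.spec sketch (the key names here are illustrative):

(require '[clojure.spec.alpha :as s])
(s/def :person/name string?)
(s/def :person/age int?)
(s/def :demo/person (s/keys :req [:person/name :person/age]))
(s/valid? :demo/person {:person/name "Fred" :person/age 30}) ; => true
(s/valid? :demo/person {:person/name 30 :person/age "Fred"}) ; => false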
JSON doesn't have schemas either; JSON Schema is just a separate schema spec that happens to build on JSON, and you might be using, for example, Zod instead of it. Similarly, systems that consume EDN can use various schema systems, for example spec or malli in the Clojure world. (Or you could be using Zod with EDN, etc.)
I don't wish to pick on this post; it looks quite well done. However, in general, I have some doubts about data formats with typed primitives: JSON, TOML, ASN.1, what have you. There's very little you can do with the data unless you apply a schema, so why decode before then? The schema tells you what type you need anyway, so why add syntax complexity if you have to double-check the result of parsing?
I think it depends on what you intend to do with the data (which is true for all of the formats that you mentioned); not everyone will do the same thing with it, even if it is the same file. It might be helpful for other programs that do not know the schema to be able to parse the data. (That is not always the case when using IMPLICIT types in ASN.1, which is one reason to use EXPLICIT instead, although it has advantages and disadvantages compared with IMPLICIT; however, in DER all types use the same framing, allowing the framing to be parsed even if the specific type cannot be understood by the reader.) This can also be useful in case the schema is later extended to use types other than the ones that were originally expected. (I prefer to use ASN.1 DER in my stuff, although JSON and other formats are also used by other software that was made by someone else.)
> It might be helpful for other programs that do not know the schema to be able to parse the data
OK, that's a really interesting question: if you're interpreting a text without knowing what it's about, having type information embedded in it could help clarify the writer's intent? That seems reasonable. Have you done this?
I can do a lot without applying a schema at all. For that I only need the handful of types defined in the EDN specification and the Clojure programming language.
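For example, in Clojure (a sketch with illustrative data):

(require '[clojure.edn :as edn])
(->> (edn/read-string "[{:name \"Fred\" :age 30} {:name \"Wilma\" :age 25}]")
     (filter #(< 27 (:age %)))
     (map :name))
; => ("Fred")
; grouping, filtering, diffing, etc. all work without any schema in sight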
Suppose you have the EDN text:
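({:name "Fred" :age 30}
 {:name 25 :age "Wilma"}) ; note the swapped values in the second map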
There's a semantic error here; the name and age fields have been swapped in the second element of the list. At some point, somebody has to check whether :name is a string and :age is a number. If your application is going to do that anyway, why do syntax typing? You might as well just try to construct a number from "Wilma" at the point where you know you need a number.
Obviously I have an opinion here, but I'm putting it out there in the hope of being contradicted. The whole world seems to run on JSON, and I'm struggling to understand how syntax typing helps with JSON document validation rather than needlessly complicating the syntax.
Can the metadata feature be used to ergonomically emulate HTML attributes? It's not clear from the docs, and the spec doesn't seem to document the feature at all.
I'm not sure how the metadata syntax works, but you might not need it, because you can do this:
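[:a {:href "https://news.ycombinator.com"} "Hacker News"] ; attributes go in a plain map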
I think you can use metadata to model HTML attributes, but in Clojure people use plain vectors for that: https://github.com/weavejester/hiccup
tl;dr: the first element of the vector is the tag, the second is a map of attributes, and the rest are child nodes:
[:h1 {:font-size "2em" :font-weight "bold"} "General Kenobi, you are a bold one"]
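For reference, Clojure's metadata reader syntax looks like this (a sketch; whether edn.c's metadata feature behaves the same way is an assumption):

(def h ^{:font-size "2em"} [:h1 "General Kenobi"])
(meta h) ; => {:font-size "2em"}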
Also Hickory:
https://github.com/clj-commons/hickory
I feel hiccup's index-based magic is a common design pattern you see in early Clojure, and it's less common now (I'd say hiccup is the exception that lives on).
I think it would be better not to use Unicode (so that you can use any character set), and to use a "0o" prefix instead of a "0" prefix for octal numbers. Also, EDN seems to lack a proper format for binary data.
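For instance, the Clojure reader already treats a leading zero as octal, which is easy to misread:

017 ; => 15, not 17; an explicit 0o17 would be unambiguous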
I think ASN.1 (and ASN.1X, which is ASN.1 with a few additional types I added, such as a key/value list and TRON strings) is better. (I also made up a text-based ASN.1 format called TER, which is intended to be converted to the binary DER format. It is also intended that extensions and subsets of TER can be made for specific applications if needed.) (I also wrote a DER decoder/encoder library in C, and programs that use that library, to convert TER to DER and to convert JSON to DER.)
ASN.1 (and ASN.1X) has many types similar to EDN's, and a comparison can be made:
- Null (called "nil" in EDN) and booleans are available in ASN.1.
- Strings in ASN.1 are fortunately not limited to Unicode; you can also use ISO 2022, as well as octet strings and bit strings. However, there is no "single character" type.
- ASN.1 does have an Enumerated type, although the enumeration is made with numbers rather than names. The EDN "keyword" type seems to be intended for enumerations.
- The integer and floating point types in ASN.1 are already arbitrary precision. If a reader requires limited precision (e.g. 64 bits), it is easy to detect that a value is out of range and report an error condition.
- ASN.1 does not have separate "list" and "vector" types, but does have a "set" type and a "sequence" type. A key/value list ("map") type is a nonstandard addition in ASN.1X; standard ASN.1 does not have one.
- ASN.1 does have tagging, although it works differently from EDN's. ASN.1 already has a date/time type, though, so that extension is not needed. Extensions are possible via application types and private types, as well as by other methods such as External, Embedded PDV, and the nonstandard types added in ASN.1X.
- The rational number type (present in edn.c, although the main EDN specification does not seem to mention it) is not a standard type in ASN.1, but ASN.1X does have one.
(Some people complain that ASN.1 is complicated; this is not wrong, but you only need to implement the parts that you will use (which is simpler when using DER rather than BER; I think BER is not very good and DER is much better), which ends up making it simpler while still being capable of doing the things that would be desirable.)
(But, EDN does solve some of the problems with JSON, such as comments and a proper integer type.)
> EDN seems to lack a proper format for binary data
The best part of EDN is that it is extensible :)
#binary/base64 "SGVsbG8sIHp6bzM4Y29tcHV0ZXIhIEhvdyBhcmUgeW91IGRvaW5nPw=="
This is a tagged literal that can be handled by a custom reader (if one is provided) during reading of the document. The result could be any type you want.
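In Clojure, for example, such a reader can be supplied like this (a sketch; the decoder choice here is an assumption):

(require '[clojure.edn :as edn])
(import 'java.util.Base64)

(edn/read-string
  {:readers {'binary/base64 (fn [s] (.decode (Base64/getDecoder) s))}}
  "#binary/base64 \"SGVsbG8h\"")
; => the raw bytes of "Hello!"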
OK, this is possible, but it seems like the sort of type that ought to be built in.
Also, if there is no binary file format for the data, then you will always need to convert to/from base64 when working with this file, whether or not you should need to.
Furthermore, this does not work very well when you want to deal with character sets rather than binary data, since (as far as I can tell from the specification) the input will still need to be UTF-8 and follow the EDN syntax of an existing type.
From what I can understand of the specification, the EDN decoder will still need to run and cannot be streamed if the official specification is followed (which can make it inefficient), although it would probably be possible to make an implementation that does this with streaming (I don't know whether the existing one does).
So, the extensibility is still restricted. (In my opinion, ASN.1 (and ASN.1X) does it better.)
I'm grateful for this! Love seeing EDN find its way into new places.
Very nice. Is there a plan to have an EDN writer in C as well?
Yes, the plan is there, but I haven't had time yet. Most likely it will be available next week.
Wonderful. Looking forward to it.
A very impressive implementation, with SIMD and WASM!
Thanks for the link!
Yes, EDN is a textual format intended to be human-readable. There is also a format called Transit used to serialise EDN elements. Unlike raw EDN, Transit is designed purely for program-to-program communication and drops human readability in favor of performance. It can encode data into either binary (MessagePack) or text (JSON), but in both cases it preserves all EDN data types. Like EDN, it originates from the Clojure language.
https://github.com/cognitect/transit-format
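A small transit-clj round trip, for illustration (JSON encoding; a sketch based on the transit-format ecosystem):

(require '[cognitect.transit :as transit])
(import '[java.io ByteArrayOutputStream ByteArrayInputStream])

(def out (ByteArrayOutputStream. 4096))
(transit/write (transit/writer out :json) {:name "Fred" :age 30})

(def in (ByteArrayInputStream. (.toByteArray out)))
(transit/read (transit/reader in :json))
; => {:name "Fred", :age 30}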
> EDN also has no builtin 'raw bytes' type.
That was my complaint too.
> I am working on a format consisting of serialized B-tree. It is essentially a dictionary, but serialized
I had wanted something a bit similar: a serialized B-tree (or a similar structure), but with only a 'raw bytes' type for keys and values (I will use DER for the values; I have my own library to work with DER already), and the ability to easily find all records whose key matches a specified prefix.