Comment by seertaak
3 days ago
That's part of the answer, but there's a bit more to it IMO.
The syntax is a bit weird; python, swift, rust, and zig feel more parsimonious.
I absolutely love multimethods, but I think the language would have been better served by non-symmetric multimethods (rather than the symmetric multimethods it uses). The reason is that symmetric multimethods require a PhD-level compiler implementation. That, in turn, means a developer can't easily picture what the compiler is doing in any given situation. By contrast, had the language designers used asymmetric multimethods (where argument position affects type checking), compilation would become trivial -- in particular, it would easily allow separate compilation. You already know how: it's the "draw shapes" trick, i.e., double dispatch. In that case, it's trivial to keep what the compiler is "doing" in your head. (Of course, the compiler is free to use clever tricks, such as dispatch tables, to speed things up.)
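To make the "draw shapes" trick concrete, here's a minimal sketch of double dispatch in Python -- the names (Shape, Circle, Rect, intersect) are illustrative, not from Julia. The point is that each of the two dispatches is an ordinary virtual call on a single, fixed argument position, so resolution is local and easy to picture:

```python
class Shape:
    def intersect(self, other):
        # first dispatch: a virtual call on self
        raise NotImplementedError

class Circle(Shape):
    def intersect(self, other):
        # second dispatch: a virtual call on the other operand,
        # carrying the now-known type of self in the method name
        return other._intersect_circle(self)
    def _intersect_circle(self, c):
        return "circle-circle"
    def _intersect_rect(self, r):
        return "rect-circle"

class Rect(Shape):
    def intersect(self, other):
        return other._intersect_rect(self)
    def _intersect_circle(self, c):
        return "circle-rect"
    def _intersect_rect(self, r):
        return "rect-rect"
```

Because argument position fixes which call resolves first, a compiler (or a reader) never has to compare whole signatures for specificity.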
The aforementioned interacts sensitively with JIT compilation, with the net outcome that it's reportedly difficult to predict the performance of a snippet of Julia code.
Just to clarify the above:
1. I use the term "performance" slightly vaguely. It comprises two distinct things: the time it takes to compile the code and the execution time. The issue is compilation time: there are certain cases where it's exponential in the number of types which could unify with the callsite's type params.
2. IIRC, the Julia compiler has heuristics to ensure things don't explode for common cases. If I'm not mistaken, not only do compile times explode, but certain (very common) things don't even typecheck. There's an excellent video about it by the designer of the language, Jeff Bezanson -- https://www.youtube.com/watch?v=TPuJsgyu87U . Note: Julia experts, please correct me if this has been fixed.
3. The difficulty in intuiting which combinations of types will unify at a given callsite isn't theoretical; there are reports of libraries which unexpectedly fail to work together. I want to qualify this statement: Julia is light years ahead of any language lacking multimethods when it comes to library composability. But my guess is that those problems would be reduced with non-symmetric multimethods.
4. The non-symmetric multimethod system I'm "proposing" isn't my idea. Such systems are referred to variously as encapsulated or parasitic multimethods. See http://lucacardelli.name/Papers/Binary.pdf
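A toy example of why symmetric dispatch is hard to keep in your head: with two methods whose signatures are mirror images, a call where both apply has no unique "most specific" match. This hypothetical registry-based dispatcher (not Julia's actual implementation) shows the shape of the problem:

```python
class Animal: pass
class Dog(Animal): pass

# methods keyed by a symmetric (type, type) signature
METHODS = {
    (Animal, Dog): "play(animal, dog)",
    (Dog, Animal): "play(dog, animal)",
}

def dispatch(*args):
    matches = [sig for sig in METHODS
               if all(isinstance(a, t) for a, t in zip(args, sig))]
    # For (Dog, Dog) both signatures apply and neither is more
    # specific than the other, so -- much like Julia's ambiguity
    # errors -- all the dispatcher can do is complain.
    if len(matches) != 1:
        raise TypeError(f"ambiguous or missing method for {args!r}")
    return METHODS[matches[0]]
```

`dispatch(Animal(), Dog())` resolves fine; `dispatch(Dog(), Dog())` is ambiguous. With asymmetric dispatch the first argument would win outright and the question never arises.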
I have huge respect for Jeff Bezanson, for the record!
> The syntax is a bit weird
In what way? It's more-or-less the same syntax as Ruby and Elixir, just with different keywords. Like as much as I love Zig, Zig's syntax is way weirder than Julia's IMO (and none of 'em hold a candle to the weirdness of, say, Erlang or Haskell or Forth or Lisp).
First, let's distinguish between two types of syntactic construct: null and left denotations. (Terminology borrowed from Pratt parsers.) Null denotations can stand on their own; left denotations can't -- they are inherently chained (e.g. arithmetic expressions, statements in a block, or elements of a tuple), and allow a succinct, infix notation for variable-length constructs (no lispy parentheses hell).
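Pratt's nud/led distinction is easiest to see in a minimal expression parser. In this sketch (grammar and token format are illustrative), a "nud" handler starts an expression from nothing, while a "led" handler extends an expression already parsed to its left:

```python
BINDING = {"+": 10, "*": 20}  # binding powers for left denotations

def parse(tokens, pos=0, min_bp=0):
    tok = tokens[pos]
    # nud: a token that can begin an expression on its own
    if isinstance(tok, int):
        left, pos = tok, pos + 1
    elif tok == "(":
        left, pos = parse(tokens, pos + 1, 0)
        pos += 1  # skip the closing ")"
    else:
        raise SyntaxError(f"{tok!r} has no null denotation")
    # led: operators that extend the already-parsed left operand
    while (pos < len(tokens) and tokens[pos] in BINDING
           and BINDING[tokens[pos]] > min_bp):
        op = tokens[pos]
        right, pos = parse(tokens, pos + 1, BINDING[op])
        left = (op, left, right)
    return left, pos
```

So `1 + 2 * 3` parses as `("+", 1, ("*", 2, 3))`: the integers and `(` have null denotations; `+` and `*` only ever appear as left denotations.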
Second, null denotations usually introduce names -- whether for variables, types, functions, lifetimes, macros, etc. One exception is free-standing value expressions (a bit weird; less so when they're the last expression in a block, indicating the value it returns). Another is directive-style constructs -- e.g. directives to import names from another module, or to give hints to the compiler. The final two exceptions are the most common of all: variable assignment and function invocation.
The golden rule of good language design, as I see it, is this: null denotations must begin with a fixed and unique token. The only permissible exceptions should be for assignment and function invocation; exceptions which exist because those use-cases appear so often in a typical program that requiring a prefix would be insufferable.
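When the golden rule holds, statement parsing collapses to a single table lookup on the leading token. A sketch, with hypothetical keywords and a deliberately naive token list:

```python
# one fixed, unique prefix token per construct => one lookup
STATEMENT_PARSERS = {
    "let":    lambda toks: ("const-binding", toks[1]),
    "var":    lambda toks: ("var-binding", toks[1]),
    "import": lambda toks: ("import", toks[1]),
}

def parse_statement(tokens):
    head = tokens[0]
    if head in STATEMENT_PARSERS:
        # a null denotation announced by its prefix keyword
        return STATEMENT_PARSERS[head](tokens)
    # otherwise it must be one of the two permitted exceptions
    if len(tokens) > 1 and tokens[1] == "=":
        return ("assignment", head)
    return ("call", head)
```

The highlighter or macro-expander case from below falls out of the same property: each statement's kind is known after one token of lookahead.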
Julia breaks this rule for global variables. (Fair enough, Python commits the same error, but it's a mistake and a source of bugs!) But wait: Julia also has "const" and "local" binding constructs, where it does follow the golden rule -- so now your syntax isn't consistent. You need to keep these nuances in your head -- and know the difference between a soft and a hard scope -- whenever you want to write a macro which modifies a function.
(As a point of taste on the choice of prefix token: introducing variables through "local" is just as weird as C++'s "auto" -- and at least Bjarne Stroustrup had an excuse for that choice. Anyone who introduces a global variable in a local scope should be punished, imho, so there's no need to say "this is a local variable"; it's obvious from the fact that the name is introduced inside a function. Instead, my personal preference is to introduce constants through "let" and variables through "var". The former is well known to anyone numerate, and the latter is ubiquitous in software engineering. Both read well; they're as close as you can get to constructs in English.)
Julia breaks the golden rule again with its succinct, Mathematica-style notation for function definition. I get that it wants to appeal to Mathematica users, but Python already proved you don't need to do that. This is a programming language; brainy types, like mathematicians and physicists, aren't going to be flummoxed by an unfamiliar notation for function definition, or irritated by having to type a few extra characters.
I mentioned macrology, but it's not just that. Say you want to write a syntax highlighter -- you need to take all that weirdness into account. If null denotations have a fixed and unique prefix, parsing is easy-peasy. Want to add the capability to inline HTML code within Julia, React-style? You'll run into similar issues. And so on...