Comment by crabbone
17 hours ago
I don't believe monads are a "heavy-handed abstraction", or that that's what prevents people from prototyping in Haskell.
What really prevents people from writing in Haskell at a reasonable speed is the poor language design. Programming languages are supposed to aid in reading by emphasizing structure. It's important to emphasize that a particular group of "words" constitutes a function call, or a variable definition, or a type definition -- whatever the language has to offer.
Haskell is a word salad. Every line you read, you have to read multiple times, every time trying to guess the structure from the disconnected acronyms. It belongs to the "buffalo buffalo buffalo buffalo" gimmick family. This is a huge roadblock on the way to prototyping as well as any other activity that implies the ability to read code quickly. And then it's also spiced by the most bizarre indentation rules invented by men.
This is not at all a problem with eg. SML or Erlang, even though they are roughly in the same category of languages.
Haskell would've been a much better language if it had made its syntax more systematic, disallowed syntactical extensions such as user-invented infix operators and overloading of literals (heaven, why???), and required parentheses around function arguments both for definition and for application. The execution model is great, the type system is great... but the surface, the front door to all these nice things the language has, is just some amateur-level nonsense.
* * *
As for the upsides of using languages from the Lisp family for practical problems... I don't find (syntax-rules ...) all that exciting. I understand this was an attempt to constrain the freedom given by Common Lisp macros, and I don't think it worked. I think it's clumsy and annoying to deal with. The very first time I tried to use it, I ran into its limitations, and that felt completely unjustified. To prototype, you want freedom of movement, not some pedantry that will stand in your way and demand you work around it somehow.
The absolute selling point, however, is SWANK. Instead of editing the source code, you are editing the program itself, which can be interacted with at points of your choosing. I don't know of any modern language that offers this kind of experience. I think, back in the '80s, this approach to programmers interacting with computers was still common. At school, we had terminals with some variety of Basic, and it worked just like that: you typed the program and it instantly showed the effect of your changes. Then there was also Forth, which worked in a similar way: it felt like you were "talking" to the computer in a very organized and structured way, but in real time.
Most mainstream languages today sprouted from the idea of batch jobs, where the programmer isn't at the keyboard when the program runs. They came with the need to anticipate and protect the programmer from every minor mistake they might've easily detected and fixed during an interactive session far, far in advance.
Whenever I think about writing in C, or Rust, or Haskell, I imagine being tasked with going to the grocery store blindfolded: I'd need to memorize the number of steps and the turns, predict the traffic, have canned strategies for what to do when potatoes go on sale... I deeply regret that programming evolved along this path, and that our idea of what it means to program is, mostly, the skill of guessing an impossible-to-predict future, instead of learning to react to events as they unfold.
Your criticism of Haskell is entirely subjective. There are lots of people, myself included, that like and prefer Haskell's syntax.
There aren't a lot of Haskell programmers, so "lots" is maybe an exaggeration.
I see OP's point. Haskell feels (or felt, I admit I haven't been keeping up the last 15 years) needlessly obtuse sometimes, like how people love to invent new infix operators all the time.
This is not what "subjective" means. You can't argue something is subjective because many people don't agree with an opinion.
When someone argues subjectivity (in a negative sense), they need to show that the opinion does not rely on facts, rather it's based on... nothing (feelings).
I offered a very easy way to numerically assess the negative impact of the poor language design choices made by Haskell's designers. It's not about what I "feel" about the language: in Java, you write a three-word program and you usually get a unique interpretation. In Haskell, you write a three-word program and you get 9 (nine) possible interpretations. It's impossible for a human to examine nine interpretations simultaneously and figure out which of them are valid and might fit the context. So, reading a Haskell program takes longer and requires more effort than reading a Java program.
Of course, Haskell programmers find ways to adapt to their misfortune. They try to avoid pathological cases (eg. writing four-word programs, let alone five!), and they memorize a lot of acronyms and non-typographical symbols that they later use to prune the search for a possible meaning of the program. They invent conventions on top of the bare language design that constrain the search space for possible programs to make their task easier.
It's absolutely possible that after layers of conventions and a long time spent memorizing various acronyms and symbols, Haskell programmers catch up to the speed of programmers in other languages: after all, the superficial difficulties with the language might seem like a small price to pay for access to the language's riches that lie beyond the surface. The language's grammar rules cannot account for the entirety of the performance of the programmers who choose to write in the language.
This situation is very similar to the "universal" (claimed, but not in practice) mathematical language, which is extremely difficult to read, write, edit, typeset... yet the tradition of using it prevails and the overwhelming majority of mathematicians use, and prefer using the "universal" mathematical language even though much saner alternatives exist.
> Haskell is a word salad. Every line you read, you have to read multiple times, every time trying to guess the structure from the disconnected acronyms. This is a huge roadblock on the way to prototyping as well as any other activity that implies the ability to read code quickly.
I couldn't disagree more. Yes, there is more upfront work in understanding Haskell code. But it's very dense. Once you understand the patterns, you can read it much more quickly. Just like map/filter/fold are harder to understand than a for-loop, but once you do, you can immediately see what kind of iteration is applied. The for-loop can do all kinds of crazy index manipulation that you always have to digest from scratch.
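A throwaway sketch of that density (my own names, purely illustrative): the shape of the iteration (filter, then map, then fold) is stated directly, where the equivalent loop would bury it in index bookkeeping.

```haskell
-- Double the even numbers in a list and sum them; the iteration
-- pattern is visible at a glance from the combinators used.
sumDoubledEvens :: [Int] -> Int
sumDoubledEvens xs = sum (map (* 2) (filter even xs))

main :: IO ()
main = print (sumDoubledEvens [1 .. 10])  -- (2+4+6+8+10)*2 = 60
```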
> And then it's also spiced by the most bizarre indentation rules invented by men.
Again, quite surprised by this criticism. The rule is extremely simple: inner expressions must be indented more. You're free to decide by how much. That's why there are many "styles" out there. Maybe that's what you mean by bizarre. But it's not like the language is forcing weird constraints on you. If anything, the constraints are too lax. Any other language with non-mandatory indentation allows that as well. In general, I really don't understand why more languages don't do mandatory indentation. You only need curly braces and semicolons if you want the option to write a whole if/else/while/... statement on one line. But nobody does that.
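A minimal sketch of that freedom (hypothetical names): both definitions below are legal, because the layout rule only demands that continuation lines be indented more than the start of the definition; the exact amount is up to you.

```haskell
-- One space of extra indentation is enough...
narrow :: Int
narrow =
 1 + 2

-- ...and eight spaces are just as valid; pick a style.
wide :: Int
wide =
        1 + 2

main :: IO ()
main = print (narrow + wide)  -- 3 + 3 = 6
```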
> inner expressions must be indented more
Not to support the parent comment, which I disagree with, but if you use multi-line let-bindings, those require that you indent not just more than the previous line, but exactly as much as the first token after the let keyword on the previous line. It’s a very strange rule, all the more surprising because it’s inconsistent even with the rest of the language. It is totally avoidable if you, like I think most experienced Haskellers do, just prefer ‘where’, but people more familiar with procedural code usually lean into using ‘let’ everywhere because it feels more familiar.
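A minimal sketch of the rule (hypothetical names): the second binding must start in exactly the same column as the first binding after the `let`, not merely further right than the previous line.

```haskell
-- 'height' must line up with 'width', the first token after 'let';
-- indenting it one column more or less is a parse error.
area :: Double
area = let width  = 3.0
           height = 4.0
       in width * height

main :: IO ()
main = print area  -- 12.0
```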
I think the strange indentation used to be required in more places - I vaguely remember running into it a lot more when I started with Haskell 20 years ago, but that was also just when I was new to the language. These days I just keep ‘let’ to a bare minimum, so it doesn’t bother me. One thing that made Elm frustrating was that it disallowed ‘where’ clauses, forcing you to deal with this weird edge case all the time.
So you want to line the equals signs up or similar?
> I couldn't disagree more
[proceeds to agree on all points]
Not even sure what to tell you... Have more introspection?
> It's important to emphasize that a particular group of "words" constitutes a function call, or a variable definition, or a type definition -- whatever the language has to offer.
Syntax highlighting? Please take a look at https://play.haskell.org/
I am completely baffled by this comment. Are you missing the parenthesized function calls by any chance? If so then I can relate a bit.
No, it's not syntax highlighting.
For background: my first time in college, I was studying typography. An integral part of this trade is figuring out what is easier for people to read by answering questions such as: what is the best line length, what number of columns per page is best, what number of ascenders per typeface is best considering letter frequencies and coincidence, and so on.
It also comes with the editing part, as in the trade of taking a manuscript (a text intended to be published) and making sure that the text meets certain reader expectations in terms of consistency, clarity, and structure. This, obviously, includes the use of punctuation, but it's more about the language structure, things like adjective order or anaphora usage, etc.
Programming languages can be judged using the same rules because, at the end of the day, we read them and need to interpret them. People have particular strengths and weaknesses when it comes to reading: we can remember an anaphora's anchor for only so long, we can hold only so many "variables" in fast-to-access memory, we can only do so many levels of adverb-phrase nesting, and so on.
Haskell was designed by someone completely oblivious to human reading abilities. It's very demanding and straining when it comes to extracting structure from text, in the same way that, in English, you'd struggle to extract structure from a so-called "garden path" sentence, because the structure is obscured. I don't believe Haskell is intentionally obfuscated; instead, I attribute the poor performance to a lack of awareness on the part of the author.
To convey the same point by means of example: Haskell is almost uniquely bad in that, given a three-word program A B C, the programmer can't tell if the program is actually A(B, C), or B(A, C), or C(A, B), or A(B(C)), or A(C(B)), or (A(B))(C), or (B(C))(A), or (B(A))(C), or (C(B))(A).
There's absolutely no reason a language should offer these kinds of puzzles, especially in a very large quantity as Haskell does. Removing this "feature" would make the language a lot easier to work with.
In Haskell it's only ever one of A(B)(C) or B(A)(C), and you can tell which based on which characters B is made up of. If B starts with one of !#$%&*+./<=>?@\^|-~` it's the second situation, otherwise it's the first.[0] All functions are unary in Haskell, so A(B, C), B(A, C) and C(A, B) can never actually happen. The cases where it looks like A(B(C)), etc. are happening are actually cases of B(A)(C); e.g. f $ g is a B(A)(C) case where B = $. So the basic syntax of Haskell is actually very simple and consistent, but due to lazy evaluation the functions can affect control flow much more than in other languages.
0: OK, there are some additional non-ASCII Unicode symbols, but everything but string literals should be kept ASCII IMO.
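A small sketch of the two shapes, using only Prelude functions: juxtaposition is left-nested application, while an operator such as ($) regroups the expression around itself.

```haskell
-- Plain application binds tightest and associates left:
--   negate 1 + 2    parses as  (negate 1) + 2
-- An infix operator between the words flips the shape:
--   negate $ 1 + 2  parses as  negate (1 + 2)
main :: IO ()
main = do
  print (negate 1 + 2)    -- 1
  print (negate $ 1 + 2)  -- -3
```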
> the programmer can't tell if the program is actually
What do you mean, "can't tell"? If I see A(B)(C) in Python, how do I know which of your 9 it means? Well, I'm a Python programmer, so I know that it means (A(B))(C), which is the function A applied to B, which returns a function that gets applied to C. If you're a Haskell programmer, you know that A B C means the same thing.
I grant you that it is odd to those who are unfamiliar and it took me quite a while to get used to it, but it's much better to write that way in Haskell when writing programs that use higher-order functions.
Mmm. I think I understand where you are coming from. You can write incomprehensible code in Haskell very easily, and I agree that some people tend to write Haskell in a way that is easy to write but very hard to read.
But that is a choice. I prefer not using complex function compositions and the lenses due to this, split complex expressions into a bunch of let bindings etc..
So you also can write very readable code in Haskell.
> (syntax-rules ...) [...] The very first time I tried to use it, I ran into its limitations
syntax-case is the general purpose construct to use. syntax-rules is a restricted, easy-things-should-be-easy construct.
https://www.scheme.com/tspl2d/syntax.html
You don't need syntax-case to do advanced things, though. Alex Shinn's match.scm uses all the dirty syntax-rules tricks.
It is pretty awful to write things like that.
It's just not good because you need to work around its limitations, whatever its purpose is. Not good for prototyping because it's the red tape you need to cut to get work done. Red tape isn't, in general, a bad thing, but when it comes to prototyping it is.
I think most people misunderstood syntax-rules. It was not meant as the macro system for Scheme. It was meant as the template macro system everyone could agree on, while leaving the more powerful low-level macro systems to the implementations: syntax-case, or explicit/implicit renaming, or syntactic closures, or what have you.
From your last paragraph, I am curious which languages / paradigms you advocate for. Sorry it wasn't clear to me except that you like SWANK, which I'm not familiar with.
I wish there was some sort of a single metric that would allow measuring languages against each other and thus determining the best one. Unfortunately, there are multiple variables and the relationship between the variables is unclear. But, going totally with my gut feeling, some examples of good languages (in terms of ease of reading) include:
* Prolog (and, by extension, Erlang).
* Pascal.
* Java 5 and earlier (and Go, as it's almost Java's twin).
These languages somehow manage to hit the sweet spot of enough systematicity and enough diversity, with few unexpected syntax constructs (eg. Pascal and Java have the "dangling else" problem, but it's manageable compared to the problems introduced by optional statement delimiters in Go or JavaScript, for example). In every such case, a programmer must program defensively against these sorts of language "pathologies".
To give some examples of questionable or outright bad design decisions:
* In Common Lisp (and Scheme as well as a number of similar languages) there's a problem with identifying the open parenthesis that will be closed by typing the closing parenthesis. Programmers must invent tools and techniques to manage this problem.
* In C++, there's a laughable (or at least there was, for a long time) rookie "whoopsie" when it comes to ">>" in templates vs. the infix operator. And the "solution" offered by the language designers makes you think they were just... lazy (add a space).
Here are also examples of some (perhaps, accidentally) good decisions:
* Kebab-case in many languages of the Lisp family. In Latin script, the position of the hyphen in the middle of the lower-case letter is a better choice than, eg., the underscore (which is touted as "not a typographic character"). Same reason why, eg. in traditional Hebrew, hyphens are at the height of a capital letter (Hebrew doesn't have lower-case letters, and the shape of its letters is better suited for hyphens at the top rather than the middle).
* Clojure as well as Racket (afaik, deliberately) introduced more kinds of parenthesis-like delimiters to make it easier to guess which expression is being terminated by the currently typed delimiter.
* * *
Note that this is a "superficial" metric, because languages are also valuable for concepts they are able to express both in terms of program logic as well as program application to the hardware it manages; the ability to process, modify, generate, analyze the language automatically; the ability to constrain the language to a desired subset of all available operations... Incorporating all of these into a single metric seems like mission impossible :)
Try Clojure with CIDER/nREPL (roughly similar to SLIME/SWANK).
>And then it's also spiced by the most bizarre indentation rules
Are you mixing tabs and spaces? Maybe an example here would help.
>overloading of literals (heaven, why???)
No, this is important, so that default strings don't have to be something crummy. Even C++ got on this bandwagon.
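For the numeric case, a minimal sketch of the mechanism being debated (hypothetical names): a literal like 1 desugars to `fromInteger 1`, so the same token takes its type from context; string literals get the analogous treatment via `fromString` under the OverloadedStrings extension.

```haskell
-- The token '1' is sugar for 'fromInteger 1', so it can stand for
-- any Num instance depending on the type demanded by the context.
asInt :: Int
asInt = 1

asDouble :: Double
asDouble = 1

main :: IO ()
main = do
  print asInt     -- 1
  print asDouble  -- 1.0
```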
>and requiring parenthesis around function arguments both for definition and for application.
??? Again, an example would be helpful. Usually the complaint with Haskell is that people don't use enough parenthesis.
>The execution model is great
...I thought lazy execution was widely agreed to be the worst part of Haskell.
> Are you mixing tabs and spaces? Maybe an example here would help.
This is not what "rules" means. Rules aren't about what I do. Rules are about what the language treats as legal or illegal. I don't write in Haskell at all because I don't like it and have no use for it, but Haskell rules don't change because of that, they are still mindbogglingly complex when it comes to telling the programmer if the next line is the right amount of space to the right or not. None of that complexity is necessary and could've been totally avoided if the language used statement delimiters.
> No, this is important, so that default strings don't to have to be something crummy.
My argument is that to get a little accidental convenience you sacrificed a huge amount of routine convenience. The mental load of having to distrust a string when you see it is just not worth the accidental convenience of writing a prepared statement and making it appear as if it was a string. In other words, you are the guy who traded a donkey for three beans, but the beans didn't sprout into a huge ladder that took you to the giant's castle. You just made a very watery soup and that was that.
> Again, an example would be helpful.
Look up the example I gave in the adjacent reply.
> I thought lazy execution was widely agreed to be the worst part of Haskell.
It's good because it's unique and, when it fits the purpose, it's useful for that particular purpose and nigh irreplaceable, because it is unique. It's worth having for the sake of research, to understand how languages can be designed and what tools or techniques can be discovered on this path. This is said from the perspective that Haskell is not an end product, but rather research attempting to study how languages can work and what concepts they can develop.
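One small sketch of the "fits the purpose" case: laziness lets a definition describe an infinite structure, and only the prefix that is actually demanded ever gets computed.

```haskell
-- An infinite list of even numbers; safe to define because nothing
-- is evaluated until a consumer demands it.
evens :: [Int]
evens = filter even [1 ..]

main :: IO ()
main = print (take 5 evens)  -- [2,4,6,8,10]
```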
> if the language used statement delimiters
I mean, it does. Whitespace-sensitive syntax is entirely opt-in; it applies only when you choose to omit delimiters. Here's an explicit delimiter example:
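A minimal sketch of what such an example can look like (hypothetical names), with explicit braces and semicolons standing in for layout:

```haskell
-- No layout anywhere: braces and semicolons delimit the let block
-- and the do block explicitly, so indentation is irrelevant.
total :: Int
total = let { x = 1; y = 2 } in x + y

main :: IO ()
main = do { print total; putStrLn "done" }
```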