Comment by ADeerAppeared
1 year ago
> sorry, not to nitpick, but I am really not a fan of one-based indexing
It is very funny how this is just the one sole criticism that always gets brought up. Not that other problems don't exist, but they're rarely talked about.
Lua's strength as a language is that it does a lot quite well in ways that aren't noticeable. But when you compare it to the competition, those strengths become quite obvious.
E.g. type coercion. A complete shitshow in lots of languages. A nightmare in JavaScript. In Lua? It's quite elegant, and most interestingly, effortlessly elegant. Very little of the rest of the language had to change to accommodate the improvement. (Exercise for the reader: spot the one change that completely fixes string<->number conversion.)
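For anyone who wants to skip the exercise: a sketch of the contrast, with JavaScript run for real and Lua's behavior described in comments (the Lua lines are from memory, so worth double-checking against the manual):

```javascript
// JavaScript overloads + for both addition and concatenation,
// so implicit coercion has to guess which operation you meant:
console.log(1 + "2");  // "12" -- number coerced to string
console.log("3" - 1);  // 2    -- string coerced to number
console.log([] + {});  // "[object Object]" -- both coerced to strings

// Lua sidesteps the ambiguity with a dedicated concatenation
// operator, so + never has to guess:
//   1 .. "2"  evaluates to "12"  (concatenation)
//   "3" + 1   evaluates to 4     (arithmetic is always arithmetic)
```

With concatenation split off into `..`, the coercion rules for `+` stay simple, which is arguably the "one change" the comment is alluding to.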
Makes Brendan Eich look like a total clown in comparison.
To be fair, having to work with 1 based indexing when you're used to 0 would be a frustrating source of off-by-one errors and extra cognitive load
As someone who's used Lua a lot as an embedded language in the VFX industry (The Games industry wasn't the only one that used it for that!), and had to deal with wrapping C++ and Python APIs with Lua (and vice-versa at times!), this is indeed very annoying, especially when tracing through callstacks to work out what's going on.
Eventually you end up in a place where it's beneficial to have converter functions that show up in the call stack frames so that you can keep track of whether the index is in the right "coordinate index system" (for lack of a better term) for the right language.
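A minimal sketch of what such helpers can look like (the names are hypothetical; the point is that the conversion gets an explicit, greppable stack frame instead of an anonymous `+1` scattered through the code):

```javascript
// Hypothetical converter functions: routing every index conversion
// through a named function means it shows up in stack traces and
// is easy to search for when hunting off-by-one bugs.
function toLuaIndex(cIndex) {
  return cIndex + 1;   // zero-based (C/C++/Python) -> one-based (Lua)
}

function toCIndex(luaIndex) {
  return luaIndex - 1; // one-based (Lua) -> zero-based (C/C++/Python)
}

console.log(toLuaIndex(0)); // 1 -- first element, Lua-side
console.log(toCIndex(1));   // 0 -- first element, C-side
```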
Oh that’s super interesting, where in the VFX industry is Lua common? I typically deal with Python and maybe Tcl (I do mostly Nuke and pipeline integrations), and I can’t think of a tool that is scripted in Lua. But I’ve never worked with Vega or Shake or whatever it is/was called
It absolutely is. For a language whose biggest selling point is embeddability with C/C++, that decision (and I'm being polite) is a head-scratcher (along with the other, similar source of errors: 0 evaluating to true).
It's the perfect distraction: once you start accepting one-based, everything else that might not be quite to your liking isn't worth talking about. I could easily imagine an alternative timeline where Lua was zero-based, but never reached critical mass.
The language actually started as a data entry notation for scientists.
I don't really blame Lua for that, though. 1-based indexing comes naturally to humans. It's the other languages which are backwards. I get why other languages are backwards from human intuition, I do. But if something has to be declared the bad guy, imo it's the paradigm which is at odds with human intuition which deserves that label, not the paradigm which fits us well.
1-based indexing is not any more natural than 0-based, it’s just that humans started indexing before the number 0 was conceptualized.
https://www.cs.utexas.edu/~EWD/transcriptions/EWD08xx/EWD831...
Why numbering should start at zero. -- Dijkstra
Especially when one of the language's main purposes is to be embedded in applications written in other languages (which are predominantly zero based) - and so you tend to have a lot of back-and-forth integration between these two styles that can get confusing. Even from the C API side, for example, the Lua stack is one-based but addressed exclusively from the host language which is likely zero-based.
Don't forget that not equals is ~=, the horror.
The real gripes should be globals by default and ... nothing. Lua is wonderful.
"Don't forget that not equals is ~=, the horror."
I get you are taking the piss, but ~= is just as logical a symbol for "not equals" as != is, if you've been exposed to some math(s). After all ... well, ! means factorial, doesn't it?
Syntax is syntax and so is vocabulary! In the end you copy and paste off of Stack Exchange and all is golden 8)
! is commonly used as the unary not operator, so "a != b" makes sense as a shortcut for "!(a == b)". a not equals b.
For a language apparently inspired by Wirth, one would have expected <> (less-than-or-greater-than). But the real horror, to me, is Windows not letting one disable the so~called "dead keys" easily.
I'm more familiar with CSS than I am with Lua. The syntax for the former has a very different meaning[1].
[1] https://developer.mozilla.org/en-US/docs/Web/CSS/Attribute_s...
It's _ENV by default; _ENV just defaults to _G.
Yeah, 36 years of Unicode and it's still not ‘≠’.
‘≠’ is in Unicode, but ‘!=’ is what's on the keyboard.
It is funny, isn't it? I always wonder how the language would be perceived had they gone with zero based indexing from the start.
I'm a big fan of Lua, including for the reasons you mention. I suspect the reason this one thing is always brought up is twofold: it's easy to notice, and it's very rare these days outside of Lua (if you consider VB.NET to be a legacy language, anyway). Other criticisms take more effort to communicate, and you can throw a rock and hit ten other languages with the same or similar issues.
>Makes Brendan Eich look like a total clown in comparison.
To be fair, Brendan Eich was making a scripting language for the 90's web. It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.
Most of the blame should go to Netscape management. They didn't give Eich much time, then burst in before he was done and made him copy a bunch of things from Java. (The new language, codenamed "Mocha" internally, was first publicly announced as "LiveScript", and then Sun threw a bunch of money at Netscape.)
IIRC, Eich was quite influenced by Python's design. I wish he'd just used Lua - would likely have saved a lot of pain. (Although, all that said, I have no idea what Lua looked like in 1994, and how much of its design has changed since then.)
https://news.ycombinator.com/item?id=1905155
If you don't know what Lua was like then, don't wish that I'd "just used Lua".
Other issues include Netscape target system support, "make it look like Java" orders from above, without which it wouldn't have happened, and more.
> It isn't his fault Silicon Valley decided that language needed to become the Ur-language to replace all application development in the future.
Which remains one of the most baffling decisions of all time, even to this day. Javascript is unpleasant to work with in the browser, the place it was designed for. It is utterly beyond me why anyone would go out of their way to use it in contexts where there are countless better languages available for the job. At least in the browser you pretty much have to use JS, so there's a good reason to tolerate it. Not so outside of the browser.
Wasn't the language he was making for the web Scheme?
No, Scheme was defined in 1975.
> To be fair, Brendan Eich was making a scripting language for the 90's web.
He was, and he doesn't deserve the full blame for being bad at designing a language when that wasn't his prior job or field of specialization.
But Lua is older so there's this element of "it didn't need to be this bad, he just fucked up" (And Eich being a jerk makes it amusing to pour some salt on that wound. Everyone understands it's not entirely serious.)
"Silicon Valley" is not an actor (human or organization of humans) that decided any such thing. This is like saying a virus decides to infect a host. JS got on first, and that meant it stuck. After getting on first, switching costs and sunk costs (fallacy or not) kept it going.
The pressure to evolve JS in a more fair-play standards setting rose and fell as browser competition rose and fell, because browser vendors compete for developers as lead users and promoters. Before competition came back, a leading or upstart browser could and did innovate ahead of the last JS standard. IE did this with DHTML mostly outside the core language, which MS helped standardize at the same time. I did it in Mozilla's engine in the late '90s, implementing things that made it into ES3, ES5, and ES6 (Array extras, getters and setters, more).
But the evolutionary regime everyone operated in didn't "decide" anything. There was and is no "Silicon Valley" entity calling such shots.
> "Silicon Valley" is not an actor (human or organization of humans) that decided any such thing.
Oh come on, you understand full well that they're referring to the wider SV business/software development "ecosystem".
Which is absolutely to blame for javascript becoming the default language for full-stack development, and the resulting JS-ecosystem being a dysfunctional shitshow.
Most of this new JS-ecosystem was built by venture capital startups & tech giants obsessed with deploying quickly, with near-total disregard for actually building something robustly functional and sustainable.
e.g. React as a framework does not make sense in the real world. It is simply too slow on the median device.
It does, however, make sense in the world of the Venture Capital startup. Where you don't need users to be able to actually use your app/website well. You only need that app/website to exist ASAP so you can collect the next round of investment.
I don't think JavaScript will replace all application development in the future. WebAssembly will displace JavaScript. With WebAssembly you can use whatever language you like and achieve higher performance than JavaScript.
Zero-based indexing is also the one thing every beginner's course has to hammer into people, as it's so non-intuitive until you know about pointers.
Honestly, it might almost act as a "honeypot": it gives people a convenient target to complain about, which makes it easier for the rest of the language to be viewed as a whole rather than nitpicked. Sometimes I think people like to have at least one negative thing to say about something new they learn, whether to show that they understand it well enough to find a flaw, or to be "cool" enough not to just like everything.
That would be a galaxy-brain move on the part of Lua.
To be clear, I'm not claiming this was the original intent behind implementing array indexes this way; I'm just theorizing that it might explain why this ends up being such a fixation when discussing the downsides of Lua.