Comment by spb
11 years ago
"How often are you going to be multiplying sevens and cats? Soooo much."
Where the fuck does this meme of "fundamental type mismatches come up all the time in ordinary code" come from? What kind of defective system are people writing where it's normal for strings and numbers to be interpreted relationally (even accidentally)?
It sounds like the author is trying to demonstrate the significance of things like syntax transformations and format conversions (like transforming an email address to a mailto link), but that's nothing like "multiplying sevens and cats". It's manipulating things that aren't inherently incompatible - if anything, it's multiplying sevens and "7"s.
All that these batshit insane contrived examples in asides like http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod... do is make code seem less accessible and comprehensible to anybody who isn't already intimately familiar with what's safe to interpret as sarcasm or hyperbole and what's not, which runs exactly contrary to the stated thesis of the article.
It can happen accidentally quite easily. Someone new to a codebase starts hacking in a feature and mistypes a variable as 'value' instead of 'values'. They fail to realize there's already a 'value' variable in the global namespace (perhaps it's a gigantic spaghetti-code mess of a file). They don't have good test cases that exercise this exact line, so they never see the bug. Code ships to production, and three months later the line runs and explodes.
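A minimal JavaScript sketch of the typo described above — all names here are invented for illustration. A pre-existing global string 'value' silently shadows the intent; one missing 's' turns numeric addition into string concatenation:

```javascript
// Pre-existing global in the spaghetti file: a raw query-string fragment.
var value = "cat=tabby";

// New feature: sum the lengths of some strings.
function sumLengths(items) {
  var values = items.map(function (s) { return s.length; });
  var total = 0;
  for (var i = 0; i < values.length; i++) {
    total += value[i]; // typo: 'value' (global string) instead of 'values'
  }
  // Indexing the string yields characters, and number + string
  // coerces to string concatenation, so we get "0c", then "0ca", ...
  return total;
}

console.log(sumLengths(["tabby", "calico"])); // "0ca" instead of 11
```

No error is thrown anywhere, which is exactly why the bug can sit unnoticed until long after it ships.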
Your example is quite good, although there are far more bulletproof ways than exhaustive test cases to make sure this doesn't happen.
On the web it's sort of all strings, so it's not hard to be in a situation where you have "length=7" & "cat=tabby" and end up with a problem. Beyond that, many developers are in the habit of using primitives for everything, which makes these sorts of errors much more common.
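A quick sketch of what "it's all strings" means in practice — the object literal here stands in for a parsed query string like "length=7&cat=tabby":

```javascript
// Query parameters always arrive as strings.
var params = { length: "7", cat: "tabby" };

var doubled = params.length * 2;          // "7" * 2  => 14  (coerced to number)
var grown   = params.length + 2;          // "7" + 2  => "72" (string concatenation)
var weird   = params.length * params.cat; // "7" * "tabby" => NaN, no error thrown

console.log(doubled, grown, weird); // 14 "72" NaN
```

The same value behaves as a number under `*` and as a string under `+`, so whether the result is right depends entirely on which operator happens to touch it first.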
Yes, it's possible to have strings for two different things. In what world are those strings going to be cross-evaluated?