On Sep 8, 2013, at 5:27 PM, Michael Richter <email@example.com> wrote:
You are arguing about a piece of mathematical logic (which is what a computer programming language is, or at least should be) by referring to usage in natural language. This is not an argument that bears scrutiny for longer than five seconds. Think, carefully, about what else would have to change to make things match natural language. (Hint: "for" and "while" aren't used intuitively either. Nor is "or". Nor is "do". Nor is "if" for that matter.) Then maybe you'll stop going down this ridiculous path.
I think that's a bit unfair .. the whole CONCEPT of high-level languages is to provide a bridge between actual computer architectures and abstractions that are more easily handled by humans, and part of that is a syntax that, by its very nature as syntax, is related to human language. Something like "if .. then .. else" might not be English, but it is designed to parse in a similar manner, and SUGGEST its meaning as a mnemonic assist (of course, this is more applicable if English is also your native language…). This doesn't of course mean that a computer language is the same as a natural language (think of the ghastly mess of AppleScript), and invariably there will be constructs that differ subtly from a casual reading of the text. The "and .. or" construct falls into this category, but it's a testament to Lua that (a) it has very few of these and (b) the few it has are, once mastered, exceedingly useful and can be justified by the need to keep the language small.
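For anyone following along, a minimal sketch of the "and .. or" construct being discussed, and the one place a casual reading of it goes wrong (the variable names here are just for illustration):

```lua
-- In Lua, `a and b` returns b when a is truthy (otherwise a), and
-- `a or c` returns a when a is truthy (otherwise c). Chained, this
-- gives the common conditional-expression idiom:
local x = 10
local label = (x > 5) and "big" or "small"
print(label)  --> big

-- The subtle difference from a real if/then/else: if the middle
-- value is false or nil, the chain falls through to the `or` branch.
local flag = true and false or "fallback"
print(flag)   --> fallback, not false
```

So the idiom reads naturally right up until the middle operand can be false or nil, which is exactly the kind of "differs subtly from a casual reading" case mentioned above.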