- Subject: RE: ~= vs. !=
- From: virgil@... (Virgil Smith)
- Date: Wed, 26 Nov 2003 18:27:27 -0600
> Last time I checked, the "~" character was the standard
> logical symbol for "not." At least it was in my
> statistics class (e.g. "A|~B").
...
> Whereas with the "~=" operator, at least this would
> be familiar to people who took logic courses.
Fine. According to "Contemporary Logic Design" by Randy H. Katz (The Benjamin/Cummings Publishing Company, Inc., copyright 1994), page 19, under "Boolean Algebra", the book writes
"the complement (inversion, negation) of X as one of X', X̄ [an X with an overbar], !X, /X, or \X."
You will note that ! is in that list, but ~ is not (which, honestly, is an
oversight in that listing). However, I think the claim that "~" is the
standard logic symbol "familiar to people who took logic courses" rests on
insufficient sampling.
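As an aside, C itself uses both characters, just not for the same thing. A
minimal sketch (plain ISO C; the result of ~ assumes a two's-complement int):

    #include <stdio.h>

    int main(void)
    {
        int x = 5;
        printf("!x = %d\n", !x);   /* logical NOT: prints 0, since x is nonzero ("true") */
        printf("~x = %d\n", ~x);   /* bitwise complement: prints -6 on a two's-complement int */
        return 0;
    }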
Furthermore, I'm befuddled by the claim that != is a *special* case. My own
frequent language switching is between C/C++ and Visual Basic, which is a
continual flip between != and <>. What's more, when switching languages I
consider the '=' character the first thing to watch out for, because of the
constant flipping between != and <>, between = and ==, and, from my student
Pascal/C days, between = and :=. The only strange thing about ~=, IMO, is
that I've never used a language that uses that notation. Of course, maybe I'm
just not "script"-language savvy enough for this crazy discussion. We may as
well be arguing about 0-based vs. 1-based arrays, or about ()'s for array
access, given how often languages differ on those points.
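To make those flips concrete, here is a minimal sketch in C, with the Visual
Basic and Pascal spellings noted in comments (an illustration only, not a
survey of every dialect):

    #include <stdio.h>

    int main(void)
    {
        int a = 1;          /* Visual Basic: a = 1        Pascal: a := 1;      */
        int b = 2;

        if (a != b)         /* Visual Basic: If a <> b    Pascal: if a <> b    */
            printf("a and b differ\n");

        if (a == b)         /* Visual Basic: If a = b     Pascal: if a = b     */
            printf("a and b are equal\n");

        /* The notation under discussion would spell the first test as a ~= b. */
        return 0;
    }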
So, does anyone have references for their assertions about the frequency of != ?