C and C++ are statically compiled languages. They're strongly typed. They make a point of being extremely fucking type savvy -- except when they're out to fuck with you; then they're just lazy elitist type snobs.
Consider the following:
#include <stdio.h>

int main( void )
{
    int i = 4;
    int d = -2;
    unsigned int k = 10;
    if ( k > i ) puts( "k is greater than i." );
    else puts( "k is less than or equal to i." );
    if ( k > d ) puts( "k is greater than d." );
    else puts( "k is less than or equal to d." );
    return 0;
}
Which results in the following:
k is greater than i.
k is less than or equal to d.
That's right, 10 is less than or equal to -2. Wait, what?
The usual arithmetic conversions specify that the signed value will be converted to unsigned for the compare, so on a 32-bit machine -2 is treated as 4,294,967,294... which really is greater than 10.
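You can watch the conversion happen directly. A minimal sketch, assuming a platform where unsigned int is 32 bits (and for what it's worth, GCC and Clang will at least flag the compare above via -Wsign-compare if you build with -Wextra):

#include <stdio.h>

int main( void )
{
    int d = -2;
    /* The cast makes explicit what the compare does implicitly:
       -2 wraps around modulo UINT_MAX + 1. */
    printf( "%u\n", (unsigned int)d );  /* prints 4294967294 with a 32-bit unsigned int */
    return 0;
}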
No sane person would ever expect a statement like this to return true when a is a positive number and b is a negative one:
if ( a < b ) return true;
The problem is that the folks who created C were insane and lazy.
It's entirely possible to make the damn comparison operators ACTUALLY WORK.
(Even my "toy" scripting language has working comparators.)
The compiler KNOWS the types are dissimilar, so it KNOWS it's about to fuck you over. It could generate extra code to make the language's constructs actually work, but: lazy.
For instance, when comparing values of dissimilar signedness, the compiler could generate an additional check to ensure the signed value (b) is actually positive:
if ( a < b && b > 0 ) return true;
Down in the ASM, they'd only have to insert one CMP/JMP instruction pair after the first compare to make such comparisons work -- any computed values would be in the registers anyhow. If you've ever seen how much extra machine code gets generated by pointer math, you'd realize one compare-and-jump pair is fucking trivial, even to the most "efficiency" focused fools.
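If the compiler won't do it for you, you can do it yourself. A minimal sketch of a sign-correct less-than for an unsigned a and a signed b (the helper name lt_us is my own invention, not any standard API):

#include <stdbool.h>

/* Sign-correct "a < b" for unsigned a, signed b.
   If b isn't positive, no unsigned value can be less than it;
   otherwise the cast is lossless and the unsigned compare is exact. */
static bool lt_us( unsigned int a, int b )
{
    return b > 0 && a < (unsigned int)b;
}

With that, lt_us( 10u, -2 ) comes back false -- which is what every sane person expected in the first place.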
In truth, this is an edge case in the language that the original implementers overlooked. Many years ago, after this trap had bitten enough folks enough times that they complained and wanted it changed, the language designers just stuck their noses in the air and said: "Deal with it."
Even though I've known about this little pitfall since my "Hello World" days, I still spent three hours chasing down a bug that was staring me in the face the whole time -- until I looked up the types of the vars, and facepalmed.
I just got so spoiled coding in other languages not created by assholes that I got used to having basic boolean logic that's actually logical.
Also, the Standard doesn't say whether a plain "char" is signed or unsigned -- it could be a signed or an unsigned byte; that's arbitrarily up to the compiler... How do you write correct comparison logic if you CAN'T know the fucking type?! "Just don't compare chars" -- ugh, I'm getting sick of this "Deal with it." bullshit.
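There is one escape hatch: <limits.h> tells you at compile time which way your compiler went, and the integer promotions mean char-sized compares are actually safe once you spell out the signedness yourself. A minimal sketch:

#include <limits.h>
#include <stdio.h>

int main( void )
{
    /* CHAR_MIN is 0 if plain char is unsigned, negative if it's signed. */
#if CHAR_MIN < 0
    puts( "char is signed here." );
#else
    puts( "char is unsigned here." );
#endif
    /* Or sidestep the question: spell out the type you mean.
       Both operands promote to int before the compare, so this prints 1. */
    signed char sc = -2;
    unsigned char uc = 10;
    printf( "%d\n", uc > sc );
    return 0;
}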