I don’t know why this matters all that much, but if ANYONE can shed ANY light on the “why” of the following behavior, that’d be great!
I ran across this when I was testing my program: I expected errors because of a couple of undefineds, but they went right through my [font=Courier New]if[/font]!
So I decided to check it out. I started with a couple tests and got the expected answers:
trace ("undefined == 0 :" + ((undefined == 0)?"true":"false"));
trace ("undefined < 0 : " + ((undefined < 0)?"true":"false"));
trace ("undefined > 0 : " + ((undefined > 0)?"true":"false"));
//output:
//undefined == 0 :false
//undefined < 0 : false
//undefined > 0 : false
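(For what it’s worth, a spec-compliant ECMAScript engine — modern JavaScript, not ActionScript — agrees with all three of these: `==` never coerces undefined to a number, and ToNumber(undefined) is NaN, so the relational tests come out false too.)

```javascript
// Standard ECMAScript gives the same three answers as the trace above:
// undefined == 0 only matches null/undefined, so it's false,
// and ToNumber(undefined) is NaN, so the < and > tests are false as well.
console.log("undefined == 0 :", undefined == 0);  // false
console.log("undefined < 0 : ", undefined < 0);   // false
console.log("undefined > 0 : ", undefined > 0);   // false
```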
Ok, that’s fine.
But what about this?
trace ("undefined >= 0 : " + ((undefined >= 0)?"true":"false"));
trace ("undefined > 0 or undefined == 0 : " + (((undefined > 0)||(undefined == 0))?"true":"false"));
trace ("undefined <= 0 : " + ((undefined <= 0)?"true":"false"));
//output:
//undefined >= 0 : true
//undefined > 0 or undefined == 0 : false
//undefined <= 0 : true
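For comparison — and this is a JavaScript sketch, not ActionScript — a spec-compliant ECMAScript engine makes `undefined >= 0` and `undefined <= 0` false as well. One hypothesis for the Flash result: if the player implemented `a >= b` as a plain negation of `a < b` (the `sloppyGte` below is my hypothetical stand-in, not anything from the Flash runtime), NaN operands would leak through as true:

```javascript
// In standard ECMAScript these are false, not true:
console.log(undefined >= 0);  // false: ToNumber(undefined) is NaN
console.log(undefined <= 0);  // false

// Hypothetical buggy implementation of >= as "not less than":
const sloppyGte = (a, b) => !(a < b);
console.log(sloppyGte(undefined, 0));  // true -- matches the trace output above
```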
So Number(undefined) should be NaN, right?
But there I ran into another strange thing:
trace ("Number.NaN == 0 : " + ((Number.NaN == 0)?"true":"false"));
trace ("Number.NaN >= 0 : " + ((Number.NaN >= 0)?"true":"false"));
trace ("Number.NaN <= 0 : " + ((Number.NaN <= 0)?"true":"false"));
//output:
//Number.NaN == 0 : false
//Number.NaN >= 0 : false
//Number.NaN <= 0 : false
trace ("NaN == 0 : " + ((NaN == 0)?"true":"false"));
trace ("NaN >= 0 : " + ((NaN >= 0)?"true":"false"));
trace ("NaN <= 0 : " + ((NaN <= 0)?"true":"false"));
//output:
//NaN == 0 : false
//NaN >= 0 : true
//NaN <= 0 : true
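In modern JavaScript, `NaN` and `Number.NaN` are literally the same value, and both give false for all three comparisons — so the split above looks Flash-specific (I suspect, but can’t confirm, that a bare `NaN` in old ActionScript is just an undefined variable, which would explain why it behaves exactly like undefined did earlier):

```javascript
// NaN and Number.NaN are one and the same IEEE-754 value in ECMAScript:
console.log(Object.is(NaN, Number.NaN));  // true
// ...and every comparison involving NaN is false:
console.log(NaN == 0, NaN >= 0, NaN <= 0);                       // false false false
console.log(Number.NaN == 0, Number.NaN >= 0, Number.NaN <= 0);  // false false false
```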
But still, [font=Courier New](Number(undefined) == NaN)[/font] returns false.
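That last one, at least, is by design: NaN is defined to compare unequal to everything, including itself, so `x == NaN` is never a useful test. The check you actually want is `isNaN()` (which ActionScript has too). In JavaScript:

```javascript
console.log(Number(undefined));         // NaN
console.log(Number(undefined) == NaN);  // false: NaN never equals anything, even itself
console.log(NaN == NaN);                // false
console.log(isNaN(Number(undefined)));  // true: this is the test to use instead
```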
I’m prepared to find out that I’m the only one who cares, but if anyone else *is* interested, please post your thoughts!