Ok some of these I understand but what the fuck. Why.
a >= b is equivalent to !(a < b)
I’m not sure if you really want to know, but:
greater than, smaller than, will cast the type, so it will be
0 > 0
which is false, of course.
0 >= 0
is true. Now
==
will first compare types; they are different types, so it’s false. Also, I’m a JavaScript dev, and if I ever see someone I work with use these kinds of hacks I’m never working with them again unless they apologize a lot and wash their dirty typing hands with… acid? :-)
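For reference, a quick check of the comparisons in question (run in any recent Node.js or browser console; the results come from the language rules, not a particular engine):

null > 0    // false
null == 0   // false  (loose equality never converts null to a number)
null >= 0   // true   (evaluated as "not (null < 0)")
null <= 0   // true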
Isn’t
===
the one that compares types first? I just tried on node and
0 == '0'
returns true.
Not a JavaScript dev here, but I work with it. Doesn’t “==” do type coercion, though? Isn’t that why “===” exists?
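A quick illustration of that difference (standard behaviour, verifiable in any JS console):

0 == '0'            // true   (== coerces the string to a number)
0 === '0'           // false  (=== checks the types first, no coercion)
null == undefined   // true   (the only loose equality null takes part in)
null === undefined  // false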
As far as I know, the operators “>=” and “<=” are implemented as the negation of “<” and “>” respectively. Why: because when you are working with totally ordered sets, like the natural numbers, those operators work.
Thus “0<=0” -> “!(0>0)” -> “!(false)” -> “true”
Correct me if my thinking is wrong though.
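That matches observable behaviour: the relational operators convert null with ToNumber, and >= reports the negation of <. Roughly (all of these are checkable in a console):

Number(null)   // 0     -> the relational comparison effectively sees null as 0
null < 0       // false (0 < 0 is false)
!(null < 0)    // true
null >= 0      // true  (same answer as the line above)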
So will
null <= 0
return true? Why tho
I know it’s a joke, but it’s an old one and it doesn’t make a lot of sense in this day and age.
Why are you comparing null to numbers? Shouldn’t you be ensuring your values are valid first? Why are you using the “cast everything to the type you see fit and compare” operator?
Other languages would simply fail. Once more, JavaScript’s greatest sin is not throwing an exception when you ask it to do things that don’t make sense.
Shouldn’t you be ensuring your values are valid first?
Step 1: Get to prod
Step 2-10: Add features
Step 11: Sell the company before it bites you
As a professional bite-meant-for-the-other-guy taker (right now in Java), this hurts my feelings.
This one is one of my favourite JS quirks:
Wait wtf is happening there?
parseInt is meant for strings, so it converts the number there into a string. Once the numbers get small enough, they start being represented in scientific notation. So
0.0000001
converts into "1e-7", where it then ignores the "e-7" part because that’s not a valid int, so it is left with 1.
https://javascript.plainenglish.io/why-parseint-0-0000001-0-8fe1aec15d8b
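Easy to reproduce in a console (the cut-off sits where Number#toString switches to exponential notation, i.e. at 1e-7 and smaller):

(0.0000001).toString()   // "1e-7" -> very small numbers stringify in scientific notation
parseInt(0.0000001)      // 1      -> parseInt stringifies its argument, then stops parsing at the "e"
parseInt(0.000001)       // 0      -> "0.000001" still stringifies normally; parsing stops at the "."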
I wrote an exam about this stuff yesterday.
In JS, equality is usually checked in a way that variables are cast to the type of the other one. “25” == 25 evaluates to true because the string converted to a number is equal to the number, and the other way around.
You can, however, check if the thing is identical using “25” === 25, which skips type conversion and would evaluate as false.
I assume the same thing happens here: null is cast to a number, which gets the value 0.
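For the record, a console check of those cases (null turns out to be the odd one out: loose equality does not fall back to a numeric cast for it):

"25" == 25    // true   (loose equality converts the string to a number)
"25" === 25   // false  (strict equality compares types first)
Number(null)  // 0      (this is the conversion the relational operators use)
null == 0     // false  (== treats null specially: it only equals undefined)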
This builds on that humorously: https://www.destroyallsoftware.com/talks/wat
I had a fun bug where unit tests started failing on an upgrade. Turns out someone was returning undefined from a comparator. Wtf, people.
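A contrived sketch of that kind of bug (not the original code): a comparator that never returns a number gets silently coerced to “equal”, so nothing throws and the order just quietly depends on the input and the engine’s sort.

const items = ['b', 'a', 'c'];

items.sort((a, b) => {
  a < b ? -1 : 1; // oops: the value is computed but never returned
});

// The comparator returns undefined for every pair; sort() coerces that to NaN,
// treats it as 0 ("equal"), and a stable sort then keeps the original order,
// so the mistake only surfaces later in assertions.
console.log(items); // ['b', 'a', 'c'] in current engines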
Huh?
Can someone explain this? I mean, the last result. Usually I can at least understand JavaScript’s or PHP’s quirks. But this time I’m stumped.
JS null and undefined shenanigans
basically:
- bigger and lesser comparisons convert null to zero, so is zero bigger than zero? no
- == is fucky, and to it null only equals undefined and undefined only equals null, so no
- is zero bigger than or equal to zero? yeah (see the snippet below)
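The undefined half of the shenanigans, for contrast (undefined converts to NaN for the relational operators, and comparisons with NaN are always false):

undefined >= 0     // false  (Number(undefined) is NaN)
undefined <= 0     // false
null >= 0          // true   (Number(null) is 0)
null == undefined  // true   (the only thing either of them loosely equals)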
Ugh, thanks, of course. Stupid brain.
I’m starting to think JS maintainers have a thing against mathematicians
more likely against humans
My only thought here is that >= is usually the same as !<, and maybe that’s how it is defined in JavaScript, so since < is false, >= == !false == true.
Is there a JS joke out there that doesn’t have to do with type coercion?