If an array is truthy in JavaScript, why is it not equal to true?

I have the following code snippet:

if([]) {
  console.log("first is true");
}

The console says "first is true", which means [] is truthy. Now I wonder why this:

if([] == true) {
  console.log("second is true");
}

and this:

if([] === true) {
  console.log("third is true");
}

log nothing. If the console logged "first is true" for the first snippet, that means [] should be true, right? So why do the last two comparisons fail? Here is the fiddle.

+4
3 answers

This comes down to truthiness. Per the ECMAScript 2015 Language Specification, every object is truthy; that is, converting any object to a boolean yields true. An if statement applies ToBoolean to its condition, so in:

if([]) { 
  ... 
}

the condition is true, because [] is an object.
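You can observe ToBoolean directly through the Boolean function (a quick sketch, not part of the original answer):

```javascript
// Every value of type Object is truthy: ToBoolean always returns true for it.
console.log(Boolean([]));                 // true
console.log(Boolean({}));                 // true
console.log(Boolean(new Boolean(false))); // true, even a wrapper around false

// Primitives, by contrast, include falsy values:
console.log(Boolean(""));                 // false
console.log(Boolean(0));                  // false
```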

When you compare with ==, however, the Abstract Equality Comparison algorithm applies, which coerces the operands step by step until their types match. Section 7.2.12 defines x == y as follows:

7.2.12 Abstract Equality Comparison

The comparison x == y, where x and y are values, produces true or false. Such a comparison is performed as follows:

[...]

  7. If Type(y) is Boolean, return the result of the comparison x == ToNumber(y).

So y (here true) is first converted to 1 by ToNumber, and the algorithm recurses with [] == 1, which then hits this step:

  9. If Type(x) is Object and Type(y) is either String, Number, or Symbol, return the result of the comparison ToPrimitive(x) == y.

For [], ToPrimitive ends up calling toString, which produces the empty string "". So after that step, the comparison is effectively:

if("" == 1) {
  ...
}

which matches this step:

  5. If Type(x) is String and Type(y) is Number, return the result of the comparison ToNumber(x) == y.

ToNumber of "" is 0, so we finally arrive at:

if(0 == 1) {
  ...
}

Since 0 is not equal to 1, the whole comparison is false. Other objects take slightly different routes through the algorithm but likewise end up unequal to true; for example, Symbol() == true and ({}) == true are also false.
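The whole chain can be unrolled by hand (a sketch of the spec steps, not how an engine actually implements them):

```javascript
// [] == true, step by step:
const step1 = Number(true);   // ToNumber(true)  -> 1
const step2 = String([]);     // ToPrimitive([]) -> ""
const step3 = Number(step2);  // ToNumber("")    -> 0
console.log(step3 === step1); // false, matching the built-in result:
console.log([] == true);      // false

// The very same steps make [] loosely equal to false, since ToNumber(false) is 0:
console.log([] == false);     // true
```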

With === the Strict Equality Comparison applies instead: if Type(x) differs from Type(y), it returns false immediately, with no coercion at all. [] is an Object and true is a Boolean, so the types differ and the result is false.
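typeof makes that short-circuit visible (a minimal illustration):

```javascript
console.log(typeof []);   // "object"
console.log(typeof true); // "boolean"
// Different types: Strict Equality Comparison returns false immediately,
// before any coercion can happen.
console.log([] === true); // false
```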

+4

Truthiness and equality are two different things. [] is truthy, but that does not make it equal to true, either loosely or strictly.

if([]===true){
  console.log("third is true");
}

logs nothing, because strict equality compares the types first, and an object is never strictly equal to a boolean. Neither does this:

if([]==true){
  console.log("second is true");
}

because loose equality coerces both operands, and [] does not coerce to the same number as true. A bare if([]) only asks whether [] is truthy, and every array is.

0

Type coercion kicks in before the boolean equality check when object types are involved.

So while "" == false // <- true and 0 == false // <- true work as expected,

with object types it does not:

null == false // <- false

so you are better off writing:

!!null === false // <- true
!![] === true    // <- true
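The double negation applies ToBoolean explicitly: the first ! coerces and negates, the second flips the result back. A quick check (not from the original answer):

```javascript
console.log(!![]);             // true:  objects are always truthy
console.log(!!null);           // false: null is falsy
console.log(!![] === true);    // true
console.log(!!null === false); // true
```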

0

Source: https://habr.com/ru/post/1679278/

