In JavaScript, why is ~~Infinity evaluated to 0?

Can anyone explain:

  • Why does double bitwise NOT on Infinity return 0?

    ~~Infinity // returns 0

  • What happens under the hood?

  • What is the binary representation of Infinity in JavaScript?

1 answer

Because JavaScript's bitwise operators do not work on the raw bit pattern of the number.

You cannot write the equivalent of the following C code in JavaScript:

#include <inttypes.h>
#include <math.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    double x = HUGE_VAL;             /* +Infinity */
    uint64_t y;
    memcpy(&y, &x, sizeof y);        /* reinterpret the raw IEEE-754 bits */
    printf("%016" PRIx64 "\n", y);   /* bit pattern of +Infinity */
    printf("%016" PRIx64 "\n", ~y);  /* all 64 bits flipped */
    printf("%016" PRIx64 "\n", ~~y); /* flipped back again */
    return 0;
}

It prints:

7ff0000000000000
800fffffffffffff
7ff0000000000000
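
(That said, the bit pattern itself is not out of reach: modern JavaScript can read it through typed arrays, which also answers the question about Infinity's binary representation. A minimal sketch, assuming a runtime with BigUint64Array (ES2020):)

    // A Float64Array and a BigUint64Array sharing one ArrayBuffer
    // let us read a double's raw IEEE-754 bits.
    const f64 = new Float64Array(1);
    const u64 = new BigUint64Array(f64.buffer);
    f64[0] = Infinity;
    console.log(u64[0].toString(16)); // "7ff0000000000000"

The point stands, though: the bitwise operators themselves never see these 64 bits.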

As MDN notes:

Bitwise operators treat their operands as a sequence of 32 bits (zeros and ones), rather than as decimal, hexadecimal, or octal numbers. ... Bitwise operators perform their operations on such binary representations, but they return standard JavaScript numerical values.

Bitwise NOT (~) performs the NOT operation on each bit: NOT a yields the inverted value of a (also known as the one's complement). Bitwise NOTing any number x yields -(x + 1). For example, ~5 yields -6.
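
A few examples of that rule in the console:

    ~5    // -6, i.e. -(5 + 1)
    ~0    // -1, i.e. -(0 + 1)
    ~-1   // 0,  i.e. -(-1 + 1)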

Section 11.4.8 of ES5 defines the operator like this:

11.4.8 Bitwise NOT Operator ( ~ )

The production UnaryExpression : ~ UnaryExpression is evaluated as follows:

  • Let expr be the result of evaluating UnaryExpression.
  • Let oldValue be ToInt32(GetValue(expr)).
  • Return the result of applying bitwise complement to oldValue. The result is a signed 32-bit integer.

ToInt32(Infinity) is +0. Its bitwise complement is 0xffffffff, which is -1 as a signed 32-bit integer. Complementing that again gives 0, which is why ~~Infinity is 0.
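
You can watch each step from the console (| 0 also forces the ToInt32 conversion, so it exposes the intermediate value):

    Infinity | 0   // 0   ToInt32(Infinity) is +0
    ~Infinity      // -1  complement of +0 is 0xffffffff, i.e. -1
    ~~Infinity     // 0   complement of -1 is 0x00000000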

In C-like terms, what JavaScript actually does is this:

#include <inttypes.h>
#include <math.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    double x = HUGE_VAL;   /* +Infinity */
    /* Mimic ToInt32: NaN and infinities map to 0. Converting an
       out-of-range double straight to uint32_t is undefined in C,
       so the infinite case must be handled explicitly. */
    uint32_t y = isfinite(x) ? (uint32_t) x : 0;
    printf("%08" PRIX32 "\n", y);              /* ToInt32(Infinity): 0 */
    printf("%08" PRIX32 "\n", (uint32_t) ~y);  /* first ~: 0xFFFFFFFF  */
    printf("%08" PRIX32 "\n", (uint32_t) ~~y); /* second ~: 0          */
    return 0;
}

It prints:

00000000
FFFFFFFF
00000000
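
For completeness, ToInt32 itself is easy to emulate in plain JavaScript. A sketch following the ES5 §9.5 algorithm (the helper name toInt32 is mine, not a built-in):

    // ES5 §9.5 ToInt32, step by step.
    function toInt32(x) {
      x = Number(x);
      // NaN, +0, -0, +Infinity and -Infinity all map to +0.
      if (!isFinite(x) || x === 0) return 0;
      // Truncate toward zero, then reduce modulo 2^32.
      let int32bit = Math.trunc(x) % 4294967296;
      if (int32bit < 0) int32bit += 4294967296;
      // Values >= 2^31 are reinterpreted as negative (two's complement).
      return int32bit >= 2147483648 ? int32bit - 4294967296 : int32bit;
    }

    toInt32(Infinity)     // 0
    ~toInt32(Infinity)    // -1, same as ~Infinity
    ~~Infinity            // 0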

Source: https://habr.com/ru/post/1680419/

