Differences in performance and memory between C# and JavaScript?

We have a WinForms C# application that models a 3D globe and the state of the world. It uses a large number of object instances, float[] arrays, and object references to represent the state of the world and the relations between objects.

We have been asked to move this software to the web and rewrite it in JavaScript.

I understand that C# is compiled down to native code, but it seems that in recent years JavaScript engines have also made great strides.

I am wondering whether there is any general information, or a comparison, of how JavaScript performs in speed and memory when handling raw data with objects and arrays, compared to .NET or other languages that run with native performance?

+4
1 answer

Short answer

If you are an experienced C# developer and a novice JavaScript developer, your C# will certainly be faster. If you are fluent in both, your C# will probably still be faster, but the difference may not be as large as you expect; it is all very specific to the program.

Longer answer

C# and JavaScript are, in a sense, both compiled languages. C# is compiled ahead of time to .NET IL, which is then compiled to machine code at runtime (that is, by JITing, typically on first call). Modern JavaScript engines also compile JavaScript to machine code; the purely "interpreted" JavaScript of the early days is largely gone (in the major engines, at least).

All else being equal, JITed .NET code will generally be faster than the equivalent JavaScript, because static typing gives the compiler much more to work with. But all else is rarely equal: the actual difference depends heavily on the program, the data structures it uses, and the engine it runs on.

Beyond that, there are a few differences between the two worth keeping in mind. For example:

  • C# is statically typed, and the types are known when the code is compiled (barring dynamic). The compiler emits IL that is already shaped for those types (which the JIT in .NET then turns into machine code).
  • Unless a C# program uses dynamic or Reflection.Emit(), its shape does not change at runtime, so the runtime can compile each method once and "trust" the result.
  • JavaScript is dynamically typed, so a JavaScript JIT has to observe and speculate about types at runtime. Firefox, for example, uses a multi-tiered JIT for exactly this reason.
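One practical consequence of that speculation: JavaScript engines specialize compiled code to the object "shapes" they observe, so code that always sees objects with the same properties in the same order tends to stay fast. A minimal sketch (the function and property names here are illustrative, not from the question):

```javascript
// Construct objects through one factory so every instance has the
// same property set in the same order - one shape for the engine.
function makeCity(name, lat, lon) {
  return { name, lat, lon };
}

function distanceFromEquator(city) {
  // When a single shape flows through this call site, a speculating
  // JIT can compile the property access into a fixed-offset load.
  return Math.abs(city.lat);
}

const cities = [
  makeCity("Quito", -0.18, -78.47),
  makeCity("Oslo", 59.91, 10.75),
];
const d = cities.map(distanceFromEquator);
```

Mixing in objects with extra or reordered properties at the same call site makes it polymorphic, and the engine falls back to slower, more generic code.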

" " ( , ) , "" . JIT-, , ( JITter , , , ). , # Java, , " JIT-" JITer .

For raw numeric data, modern JavaScript also offers typed arrays, which store unboxed values contiguously (Float32Array, Int32Array, and so on). Engines such as Firefox's optimize tight loops over them aggressively (they are much closer to a C# float[] than a plain JavaScript array, and the JIT knows it).
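Since the question mentions large float[] arrays, a typed array is the closest JavaScript analogue. A small sketch (the globe/vertex framing is an assumed example, not from the original code):

```javascript
// A Float32Array stores unboxed 32-bit floats contiguously,
// much like a C# float[] - no per-element boxing or mixed types.
const vertexCount = 3;
const positions = new Float32Array(vertexCount * 3); // x, y, z per vertex

// Writes are coerced to 32-bit floats.
positions[0] = 1.5;
positions[1] = -2.25;
positions[2] = 0.5;

// A tight numeric loop over a typed array is exactly the kind of
// code a modern JS engine compiles down to efficient machine code.
function scale(arr, factor) {
  for (let i = 0; i < arr.length; i++) {
    arr[i] *= factor;
  }
}
scale(positions, 2);
```

Besides speed, the memory win matters here: a plain array of one million JS numbers costs far more than the 4 MB a Float32Array of the same length occupies.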

In short, well-written JavaScript on a modern engine can be surprisingly fast. A speculating JIT (and today's engines have very good JITs) narrows the gap with statically-typed, natively-compiled code considerably, though it does not close it entirely.

+5

Source: https://habr.com/ru/post/1629183/

