I am porting some of my OpenGL code to WebGL, and the fact that JavaScript does not have genuine arrays is unfortunate. I can use Float32Array (and the other typed-array views over ArrayBuffer), but this does not seem to help performance.
As an experiment comparing the performance of Array vs Float32Array vs Float64Array, I ran a bubble sort over 100,000 floats to see if there is a difference:
    function bubbleSort(array) {
        var N = array.length;
        for (var i = 0; i < N; i++) {
            // After each pass the largest remaining element has bubbled
            // to the end, so the inner loop can stop one element earlier.
            for (var j = 0; j < N - 1 - i; j++) {
                if (array[j] > array[j + 1]) {
                    var tmp = array[j];
                    array[j] = array[j + 1];
                    array[j + 1] = tmp;
                }
            }
        }
    }
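For reference, a timing harness along these lines drives the comparison (a minimal sketch; the benchmark function and its labels are illustrative names I am adding here, and performance.now() assumes a browser environment):

    function benchmark(label, array) {
        // Fill with random floats so every run sorts comparable data.
        for (var i = 0; i < array.length; i++) array[i] = Math.random();
        var start = performance.now();
        bubbleSort(array);
        console.log(label + ": " + (performance.now() - start).toFixed(1) + " ms");
    }

    var N = 100000;
    benchmark("Array",        new Array(N));
    benchmark("Float32Array", new Float32Array(N));
    benchmark("Float64Array", new Float64Array(N));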
Not a big difference. In fact, for bubbleSort to get really good performance, the JIT compiler would need static type information about its array argument. Are we just stuck with poor array performance in JS? How can we get around this, other than using asm.js, which ...
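One idea I have been toying with (a sketch only, and not something I have verified actually helps): give each element type its own copy of the sort, so each copy is only ever called with one kind of array and the JIT can specialize it. makeSorter is a name I am inventing for illustration:

    function makeSorter() {
        // Each call returns a distinct function object, so each copy
        // collects type feedback for only one array kind and its call
        // sites can stay monomorphic in engines like V8.
        return function (array) {
            var N = array.length;
            for (var i = 0; i < N; i++)
                for (var j = 0; j < N - 1 - i; j++)
                    if (array[j] > array[j + 1]) {
                        var tmp = array[j];
                        array[j] = array[j + 1];
                        array[j + 1] = tmp;
                    }
        };
    }

    var sortPlain   = makeSorter(); // only ever sees Array
    var sortFloat32 = makeSorter(); // only ever sees Float32Array
    var sortFloat64 = makeSorter(); // only ever sees Float64Array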