I am developing a small web-based utility that displays data from a few database tables. It works fine on FF, Safari, Chrome ... but memory management on IE8 is horrific. The largest JSON request returns enough data to build about 5000 rows in the table (3 columns per row).
I use jQuery to get the data (via getJSON). To clear the old/existing table, I just do $('#my_table_tbody').empty(). To add the new information, in the getJSON callback I build every row of the table into a variable and then, once I have everything, use $('#my_table_tbody').append(myVar) to add it to the existing tbody. I am not adding table rows as they are created because that seems to be much slower than adding them all at once. A rough sketch of this approach is below.
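Roughly, the current approach looks like this (a sketch only; the URL, element IDs, and column names are placeholders standing in for my real ones):

// Sketch of the empty-then-append approach; 'my_server', the IDs, and field names are placeholders.
$('#my_table_tbody').empty(); // throw away the old rows

$.getJSON('my_server', function (rows) {
    var myVar = ''; // build every row into one string first
    for (var i = 0; i < rows.length; i++) {
        myVar += '<tr><td>' + rows[i].col1 + '</td>' +
                 '<td>' + rows[i].col2 + '</td>' +
                 '<td>' + rows[i].col3 + '</td></tr>';
    }
    $('#my_table_tbody').append(myVar); // one append call instead of thousands
});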
Does anyone have any recommendations on what to do when trying to add thousands of rows of data to the DOM? I would like to stay away from pagination, but I wonder if I have a choice.
Update 1: So, here is the code I tried after the innerHTML suggestion:
/* Assuming a div called 'main_area' holds the table */
document.getElementById('main_area').innerHTML = '';
$.getJSON("my_server", {my: JSON, args: are, in: here}, function (j) {
    var mylength = j.length;
    var k = 0;
    var tmpText = '';
    tmpText += ''; /* Add the table, thead stuff, and tbody tags here */
    for (k = mylength - 1; k >= 0; k--)
    {
        /* One <tr> with three <td> cells per data row */
        tmpText += '<tr class="' + j[k].row_class + '"><td class="col1_class">' + j[k].col1 + '</td><td class="col2_class">' + j[k].col2 + '</td><td class="col3_class">' + j[k].col3 + '</td></tr>';
    }
    document.getElementById('main_area').innerHTML = tmpText;
});
That is the essence of it. I also tried using just a $.get request, where the server sent back pre-formatted HTML and I simply assigned it to innerHTML (i.e. document.getElementById('main_area').innerHTML = j;). A sketch of that variant is below.
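For reference, the $.get variant was roughly this (again a sketch; 'my_server' and 'main_area' are the same placeholder names as above):

$.get('my_server', function (j) {
    // j is the complete table markup, already formatted by the server
    document.getElementById('main_area').innerHTML = j;
});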
Thanks for all the answers. I appreciate that you are all willing to help.